All Posts Tagged ‘Data’
Identifying Long Running Queries is the First Step
At this step in the SQL Server performance audit, you should have identified all the “easy” performance fixes.
Application and Transact-SQL Code Greatly Affect SQL Server Performance
A whole bunch of people at work asked me today if I had seen a recent posting on the Google Blog News Channel: What Search Engines Do Search Engine Companies Use? Some interesting data about which search engines are used by search engine employees. Not surprisingly, a lot of Yahoo and Microsoft employees use Google.
Attunity announced a joint Webinar with Microsoft to educate enterprises on the benefits of real-time data integration for business intelligence.
Recently, I was asked to help someone clean up their database after they had double loaded an import file.
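A cleanup like that can be sketched in a few lines. The table name and columns below are hypothetical, and SQLite stands in for whatever database was actually involved:

```python
import sqlite3

# A hypothetical "invoices" table that was loaded twice from the same
# import file, producing one exact duplicate of every row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (invoice_no TEXT, amount REAL)")
rows = [("1001", 50.0), ("1002", 75.5)]
conn.executemany("INSERT INTO invoices VALUES (?, ?)", rows)
conn.executemany("INSERT INTO invoices VALUES (?, ?)", rows)  # the double load

# Keep the copy with the lowest internal rowid and delete the rest.
conn.execute("""
    DELETE FROM invoices
    WHERE rowid NOT IN (
        SELECT MIN(rowid) FROM invoices
        GROUP BY invoice_no, amount
    )
""")
count = conn.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
print(count)  # back to the original 2 rows
```

The GROUP BY must list every column, since any difference between two rows means they are not true duplicates.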
SMS for Enterprise Messaging – Value added services
Most interactive websites nowadays require data to be presented dynamically and interactively based on input from the user. For example, a customer may need to log into a retail website to check his purchasing history. In this instance, the website would have stored two types of data in order for the customer to perform the check – the customer’s personal login details; and the customer’s purchased items. This data can be stored in two types of storage – flat files or databases.
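As a minimal illustration of the two storage options (with made-up customer data), the same record can live in a delimited flat file or in a database table that can be queried by key:

```python
import csv
import io
import sqlite3

# Flat file: one line per customer, fields separated by a delimiter.
flat = io.StringIO()
writer = csv.writer(flat)
writer.writerow(["alice", "alice@example.com"])
flat.seek(0)
row = next(csv.reader(flat))

# Database: the same record in a table, retrievable by login.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (login TEXT PRIMARY KEY, email TEXT)")
db.execute("INSERT INTO customers VALUES (?, ?)", ("alice", "alice@example.com"))
email = db.execute(
    "SELECT email FROM customers WHERE login = ?", ("alice",)
).fetchone()[0]
print(row[0], email)
```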
Business intelligence solutions narrow the ever widening gap between the amount of data gathered and the methods of being able to understand that data.
What is RAID RECOVERY? RAID stands for Redundant Array of Inexpensive Disks. It is a method of combining several hard drives into one unit. This method offers fault tolerance (the ability of a system to continue to perform functions even when one or more hard disk drives have failed) and higher protection against data loss than a single hard drive.
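The fault-tolerance idea can be shown in miniature with XOR parity, the trick behind RAID levels such as RAID 5. This is a toy sketch over byte strings, not real drives:

```python
# With N data blocks plus one XOR parity block, any single lost block
# can be reconstructed by XORing all of the surviving blocks together.
def parity(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

data = [b"disk", b"one!", b"two?"]
p = parity(data)

# Simulate losing the second "drive" and rebuilding it from the survivors.
rebuilt = parity([data[0], data[2], p])
print(rebuilt)  # b'one!'
```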
Missing data can be very annoying to a programmer. In fact, it is so annoying that very often we’ll write separate programs to clean up data and eliminate unpleasant conditions so that the main program doesn’t have to deal with them. Here, I’ll show some examples of the kinds of problems we see.
Linux programs can now exchange data in real time with a wide range of Windows programs, across a network or the Internet.
Measuring Potential Value with LifeCycle Metrics
This whole potential value measurement issue is, of course, the big problem embedded in the preaching you hear on LifeTime Value, CRM, and these portfolio models of customer value. How do you deal with this whole “potential value” question? How do you actually measure it and act on it?
Navara released two new products designed to deploy mobile enterprise applications to users of BlackBerry from Research In Motion.
Hosted By Sun Microsystems, Attendees Will Gain Insights On How To Increase Wireless Data Service Revenue With Java.
First Deployment of TechnoCom’s End-to-End Location Solution Completed by Latin American Wireless Operator.
Yosemite Technologies today announced its plans to support Microsoft Data Protection Server (DPS), a continuous, low-cost disk-based backup and recovery solution.
Create!form v6.0 adds powerful new features including document repagination, calculations, ODBC lookups, enhanced print performance and more, to save time and resources.
Cost-Effective Data Management Solution for PeopleSoft Users Addresses Availability and Business Management Needs.
Storage Management Solutions Now Span Industry Leading Operating Systems; Maximizing Choice and Flexibility for Customers While Driving Down Costs.
Industry Partners Demonstrate Broad Support for Microsoft Data Protection Server to Provide Customers With Rapid and Reliable Disk-Based Data Recovery.
Airbee Wireless announced that it has filed a patent application with the U.S. Patent and Trademark Office entitled “System and Method for Data Transmission.”
Oracle today announced a record-breaking 8 processor TPC-H 300 GB data warehousing benchmark result for Oracle Database 10g and Oracle Real Application Clusters on Linux, showing yet again Oracle’s ability to manage data warehouses on low-cost clustered Linux servers.
Up to version 7.5, Microsoft Great Plains (and, if you are looking at version 7.0, 6.0, 5.5, 5.0 or 4.0, then it was Great Plains Dynamics) was available on Pervasive SQL 2000 or Btrieve.
Leaders of Wireless Industry Publish Specifications for Extending Mobile Voice and Data Services Over WLANs
Kineto Wireless announced that the company, along with thirteen other leading service providers, infrastructure suppliers and handset manufacturers in the wireless industry, has published a set of open specifications for extending mobile voice and data services over Wireless LANs.
NetLogic Microsystems announced that it is delivering data plane products for Alcatel’s highest-performance multi-service switching and routing product, the 7670 Routing Switching Platform (RSP).
Acxiom Corporation, the global leader in customer data management, today announced that it had been named the 2004 “CRM Data Quality Market Leader” by CRM magazine, the industry’s leading publication dedicated to helping companies become customer-focused organizations.
Track Data today announced that Nasdaq has advised the Company that its common stock has not met the minimum $1.00 per share closing price required to avoid delisting from Nasdaq’s National Market.
Oracle received top honors in two categories in Intelligent Enterprise Magazine’s 2004 Reader’s Choice Awards for Best Information Technology Products.
CNET Networks Taps Tera-Scale Data Warehouse Appliance for Comprehensive, Near Real-Time Analysis of Customer Behavior and Marketing Programs.
Targeting high-potential markets with a direct mail marketing campaign can be a very affordable and efficient way to get new customers for most companies and entrepreneurs. But how can you find a way to reach those high-potential markets? Using demographic data could be your solution.
It looks like Microsoft Great Plains is becoming more and more popular, partly because of Microsoft’s muscle behind it. It is now targeted at the whole spectrum of horizontal and vertical market clientele. Small companies use Small Business Manager (which is based on the same technology – the Great Plains Dexterity dictionary and runtime), Great Plains Standard on MSDE is for small to midsize clients, and then Great Plains serves the rest of the market up to big corporations.
The single most important factor to the success of a CRM implementation is the data migration. To maximize the potential for success, there are three key elements that need close consideration throughout all aspects of the implementation: pre-import data cleansing, data enhancement, and data integration.
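A pre-import cleansing pass might look something like this sketch. The field names and rules are hypothetical examples, not a prescription:

```python
# Trim whitespace, normalize e-mail case, and drop exact duplicates
# before the records ever reach the CRM import.
def cleanse(records):
    seen = set()
    out = []
    for rec in records:
        rec = {
            "name": rec["name"].strip().title(),
            "email": rec["email"].strip().lower(),
        }
        key = (rec["name"], rec["email"])
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out

raw = [
    {"name": "  jane doe ", "email": "Jane@Example.COM"},
    {"name": "Jane Doe", "email": "jane@example.com"},
]
clean = cleanse(raw)
print(len(clean))  # 1
```

Catching near-duplicates like these before import is far cheaper than merging records after the CRM has assigned them separate identities.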
When application developers are developing, parameters are often hard-coded in the source code. These hard-coded parameters are often pulled out of the source code and put into property files or configuration files. System and network security policies may force a developer to address security concerns over the data that is stored in external files. So, how do you make sure that your sensitive external parameters are safe?
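One common answer, sketched below with a hypothetical DB_PASSWORD variable, is to keep only non-sensitive parameters in the external file and pull secrets from the process environment, which deployment tooling can populate without ever writing them to disk:

```python
import os

# Simulate the deployment environment setting the secret; in real use
# this would come from the operating system, not from the code itself.
os.environ["DB_PASSWORD"] = "s3cret"

# Non-sensitive parameters remain safe to keep in a property file.
config = {"db_host": "localhost", "db_user": "app"}

def load_secret(name):
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"required secret {name} is not set")
    return value

config["db_password"] = load_secret("DB_PASSWORD")
print(config["db_password"])
```

Failing fast when a secret is missing is deliberate: a loud startup error is easier to diagnose than an authentication failure deep inside the application.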
What are the pitfalls to avoid when starting to measure Web site performance using Web Site Analysis Tools?
Displaying the News Items for a Particular Syndication Feed
The next task that faces us is creating the DisplayNewsItems.aspx Web page. This page should display the titles of the news items in the selected syndication feed as hyperlinks such that when the hyperlink is clicked the description of the news item is shown in the bottom right frame. This task presents us with two primary challenges:
Hello, everyone. Let’s take a look at loadVars.sendAndLoad and what we can do with it today. To read more about it, go here:
A well-designed chart can be one of the most persuasive elements of your trade show booth display and literature. It illustrates to your customers why your product is the obvious solution to one of their specific needs. It can communicate major benefits or features more clearly than words can.
Always wanted to build an easy solution to select multiple items from an ASP.NET DataGrid and delete them all at once, like Hotmail does? Well, it can be built in just a few simple steps.
Web development has come a long way since simple script-based Web programming technologies like Microsoft Active Server Pages (ASP). With Microsoft ASP.NET, a lot of the tedious, repetitious coding chores that were commonplace with classic ASP are now a thing of the past. For example, as all one-time classic ASP developers know, displaying data in a classic ASP Web page required the following pseudocode:
Analyzing the DataList
Recall that the DataGrid renders as an HTML <table>, with each DataSource record as a table row (<tr>) and each record field as a table column (<td>). At times you might want more control over the presentation of data. For example, you might want to have the data displayed in an HTML <table>, but rather than have one record per row, you might want to display five records per row. Alternatively, you might not want to have the data displayed in a <table> tag at all, but rather have each element displayed in a…
Today’s business environment has changed drastically from just a few years back. Rather than working exclusively with equipment, data, and systems, today’s IT managers face issues such as cross training, personnel management, interdepartmental communication, and a widening job scope for all IT employees.
Is Regular Backup Enough?
Most businesses secure their information infrastructure by regularly backing it up onto tape. Some have gone further, enhancing their backup strategy with expensive disk arrays and mirroring. Whether an earthquake, a flood, a blackout or a hard disk failure should catch them by surprise, these backups would ensure the survival of their information. Should human or software error (which account for approximately 40% of all application-related disasters) corrupt their data, they would simply reach for a recent backup, which would help them back on their feet. But would simply having these regular backups stashed away someplace safe be enough?
“Analytics.” The word sounds technical, number-crunchy, maybe even a bit boring. We information architects and user experience folks tend to prefer dealing with the real users, the designs, and the creative expression of our ideas, and not so much with the numbers. We spend our time developing prototypes, testing designs with users, and then interpreting those results for a creative solution that provides outstanding user experiences. But our exposure to the data and measurement end can be limited, or nonexistent.
Your worst nightmare just became a horrifying reality. You keep hearing that little voice in your head mockingly shout, “You should have backed that stuff up!” The voice keeps echoing through your head as you perform a quick inventory of all the important information you just lost: your client database, a year’s worth of e-mail, your entire inventory database, even your family photos.
What would happen to your data if: a. you backed your SUV over your laptop? b. your laptop spent two days in a sunken cruise ship submerged at the bottom of the Amazon River? c. firefighters rescued your computer’s scarred carcass from a flaming warehouse?
I have to assume that after reading the title to this week’s article you probably let out a low, discerning groan of displeasure. Of all the networking topics that I’ve ever taught, the OSI model is the one that will generate looks of angst and torture on the faces of students. Some simply tune out and play solitaire, while others shake their heads in the familiar “not this thing again”. Still others pretend that there’s no time like right now to catch a half-hour of shut-eye. One thing that has never happened is someone’s eyes lighting up and them shouting “I love the OSI model!”. The reason for this is simple. This is often the first thing that people are ever taught when it comes to networking, and it was probably explained in such a way that they didn’t understand it to begin with. A theoretical model is a terrible introduction to the world of networking. The good news for me is that most of you probably already have experience in the field, and will hopefully be able to appreciate how important network models really are.
This sample deals with the retrieval of the value of the Autonumber field for a data row inserted in MS Access 2000. SQL Server provides access to new Identity values through SCOPE_IDENTITY, IDENT_CURRENT and @@IDENTITY based on the scope and session boundaries. In Jet 4, Microsoft added support for ANSI-92 SQL syntax, including support for @@IDENTITY. This feature can be very useful in the Internet mode. Typically, you will be able to identify and access rows inserted from Web pages and manipulate the newly added rows.
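For comparison, SQLite exposes the same idea through Cursor.lastrowid, which returns the autonumber key generated by the most recent INSERT on that cursor. This is a rough analog for illustration, not the Jet/@@IDENTITY mechanism itself:

```python
import sqlite3

# An autoincrementing key plays the role of the Access Autonumber field.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY AUTOINCREMENT, item TEXT)"
)
cur = conn.cursor()
cur.execute("INSERT INTO orders (item) VALUES (?)", ("widget",))
new_id = cur.lastrowid  # the key generated by the INSERT above
print(new_id)  # 1
```

As with @@IDENTITY, the value is scoped to the connection doing the insert, so concurrent users each see their own newly generated key.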
Managing multiple versions of the same database across different environments (i.e., Development, Test, Production, and a Disaster Recovery site) is perhaps one of the most important and least favorite database administration duties. Whether it’s synchronizing database objects or actual data, creating utilities to perform this obligatory task is somewhat of a dreaded chore. I mean, think of it: when was the last time you looked forward to creating one of these utility jobs? My point exactly!
In this example, we will create a windows form application to demonstrate the transfer of data between a windows form and a dialog box displayed from the windows form.
Wireless technology is evolving at a rapid pace. There is a lot of talk about mobile and wireless computing and there is also a fair amount of hype. However, the one thing that is conspicuously absent from many of these discussions on mobile and wireless computing is a discussion on what these devices are connecting to. The fact is, most of the value, in terms of content and capabilities of the device, is a result of interacting with a server of some type. This is true whether we are talking about microbrowsers such as WAP and iMode, J2ME clients, or short message service (SMS) and email. Behind the scenes these devices are interacting with services that reside somewhere on a network. These services handle much of the complex details of the features offered by wireless devices. Although there are complexities that the mobile device must deal with, a well-designed wireless architecture delegates as much complexity as possible to the server. This is desirable because servers have more processing capabilities and do not have the power restrictions of mobile devices (i.e., servers don’t run on batteries that are worn down by the CPU). This article examines wireless computing from the server’s perspective. First, the problems of wireless computing over the public Internet are discussed. Then various models that help address these problems are provided.
Monitoring Windows NT/2000/XP/2003 is important even for small environments. With automatic monitoring, critical failures can often be avoided. But how do you monitor a system without too much effort? The basic idea behind a successful monitoring and alerting system is to centralize all system events at a single monitoring station. Once the information is centralized, it can be used to build an alerting system or even carry out corrective actions.
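The centralize-then-alert idea can be sketched with Python’s logging module standing in for a real monitoring station. The logger names here are invented for the example:

```python
import logging

# Every component logs to one central logger, whose handler raises an
# alert whenever it sees a critical event.
alerts = []

class AlertHandler(logging.Handler):
    def emit(self, record):
        if record.levelno >= logging.CRITICAL:
            alerts.append(record.getMessage())

central = logging.getLogger("monitor")
central.setLevel(logging.INFO)
central.addHandler(AlertHandler())

# Two "systems" report into the same station via child loggers.
logging.getLogger("monitor.web").info("request served")
logging.getLogger("monitor.disk").critical("disk failure on /dev/sda")

print(alerts)  # only the critical event triggers an alert
```

Because child loggers propagate their records upward, adding a new monitored system is just a matter of logging under the central logger’s namespace.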
Before you choose a hosting plan, there are many things to consider. Two of the most important are the Web Server Space and the Data Transfer Allowance (also called bandwidth) that you will need. Web hosts will usually try to lure you with either a large amount of Web Space or monthly Data Transfer Allowance. Though the best case scenario would be to have plenty of both, most hosts tend to offer more of one and less of the other, so you will have to find the right balance.
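A rough way to estimate the Data Transfer Allowance you need is page size times page views. The numbers below are made-up examples, not figures from any hosting plan:

```python
# Back-of-the-envelope monthly transfer estimate.
avg_page_kb = 60          # hypothetical: HTML plus images for a typical page
page_views_per_day = 500  # hypothetical traffic level

monthly_gb = avg_page_kb * page_views_per_day * 30 / (1024 * 1024)
print(round(monthly_gb, 2))  # 0.86
```

Even modest traffic at these numbers stays under 1 GB a month, which is why the transfer allowance matters far more for image-heavy or high-traffic sites.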
As you have worked with SQL Server, you probably have run across the terms data cache and procedure cache, and may have wondered what exactly a cache was. SQL Server is configured to use a physical pool of memory on the server, and it will allocate the majority of this memory pool to hold data pages that have been read, along with the compiled execution plans for all previously-run Transact-SQL statements. It is this dynamic pool of memory that is being referred to by the data cache and procedure cache. Before SQL Server 7.0, the data cache and procedure cache were two separate pools of memory and could be controlled separately. In SQL Server 7.0 and SQL Server 2000, one pool of memory is used both for data and execution plans.
Please note – Data Islands are exclusive to Internet Explorer!
This month we start a two part series on Data Islands. Part 1 explores how we can use Data Islands to embed XML and XSLT into a browser, and manipulate that data using DHTML. Part 2 will illustrate Data Islands and Data Binding, and how to update data from the browser with Web Services and XMLHTTP.
This article is another illustration of why using PHP with XSL to transform XML data for various presentation layers is beneficial. Rather than focusing exclusively on PHP and XSL, it will demonstrate how to present the same data to several different wireless technologies using PHP and XSL.
Suppose you’re writing a query to find all the invoices that were written on January 6, 2003. You know from the control totals that 122 invoices were written that day. But when you run this query:
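One classic culprit is a date column that also carries a time of day, so an equality test against the bare date misses most rows, while a half-open range covers the whole day. The sketch below uses SQLite with text timestamps and an invented invoices table, so the details differ from SQL Server, but the shape of the fix is the same:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (invoice_no TEXT, written_at TEXT)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [
        ("A1", "2003-01-06 00:00:00"),
        ("A2", "2003-01-06 14:23:05"),  # time of day defeats equality tests
        ("A3", "2003-01-07 09:00:00"),
    ],
)

# Comparing against the bare date matches none of the timestamped rows...
eq = conn.execute(
    "SELECT COUNT(*) FROM invoices WHERE written_at = '2003-01-06'"
).fetchone()[0]

# ...while a half-open range catches every invoice written that day.
rng = conn.execute(
    "SELECT COUNT(*) FROM invoices "
    "WHERE written_at >= '2003-01-06' AND written_at < '2003-01-07'"
).fetchone()[0]
print(eq, rng)  # 0 2
```

The half-open range (inclusive start, exclusive end) also avoids the fragile trick of comparing against '2003-01-06 23:59:59', which silently drops rows with sub-second precision.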
Paco Underhill’s Why We Buy: The Science Of Shopping (Touchstone Books, 2000) illuminated the mysterious behavior of shoppers wandering around in retail stores.
The realization comes to marketers in a flash that the behavior of a website visitor is immeasurably more measurable. It’s obvious. It’s intuitive. It’s exciting.
Eavesdropping attacks are often easy to launch, but most people don’t worry about them in their applications. Instead, they tend to worry about what malicious things can be done to the machine on which the application is running. Most people are far more worried about active attacks than they are about passive attacks.
Storing and displaying data is a common and essential task if you are working with applications. It doesn’t matter whether you are working with desktop applications or Web applications.
I have received over two dozen calls on Monday, Aug. 11, 2003, from people with Windows XP reporting an error stating there was an error in the RPC service and that the system will reboot.
They say that a picture is worth 1,000 words, and in the world of the Internet, where your Web host charges you for the amount of data transferred from the server to your customers, that is very literally true. Some site hosts promise unlimited data transfer (at least in theory), but most providers charge huge fees if you exceed some predetermined limit. When pictures take 20 kilobytes and more, getting a visually interesting site without a huge data transfer cost requires some planning and some really neat tricks.
When writing scripts, it is extremely important to have the ability to transfer information from one script to another. A common method to do this is with the GET convention. Search engine Web spiders, however, tend to ignore pages whose URL contains GET method parameters. If you’re not sure what a GET method parameter is, here’s an example of a URL with GET method parameters:
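The example URL itself is missing from this excerpt, but for illustration, a hypothetical GET-style URL carries its parameters after the “?”, and a script can pull them apart programmatically:

```python
from urllib.parse import parse_qs, urlparse

# A made-up URL carrying two GET method parameters after the "?".
url = "http://www.example.com/search.php?query=widgets&page=2"

params = parse_qs(urlparse(url).query)
print(params["query"][0], params["page"][0])  # widgets 2
```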
Jeff Prosise has written an article, “Currency Converter with ASP.NET Web Forms,” in which he pretty much explains how to load XML data with ASP.NET from the “Rates.xml” file. In this article I have created a Currency Converter server which can be scheduled to extract the data from a third-party site and build “Rates.xml” dynamically.
If you collect and use any kind of data you probably have some kind of organizing system. Whether you use index cards, a filing cabinet, Excel spreadsheets, or some kind of database program, your system should let you add and change data, delete data, and retrieve data, and it should work faster and more efficiently than if you had to do it by hand.
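Those four operations (add, change, retrieve, delete) can be sketched against a small SQLite table; the schema here is purely illustrative:

```python
import sqlite3

# The operations any organizing system should support, in order.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")

db.execute("INSERT INTO notes (body) VALUES (?)", ("call the printer",))    # add
db.execute("UPDATE notes SET body = ? WHERE id = 1", ("call the vendor",))  # change
body = db.execute("SELECT body FROM notes WHERE id = 1").fetchone()[0]      # retrieve
db.execute("DELETE FROM notes WHERE id = 1")                                # delete
remaining = db.execute("SELECT COUNT(*) FROM notes").fetchone()[0]
print(body, remaining)  # call the vendor 0
```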