Posts Tagged ‘Information’

08 Sep 08

Gift cards on-demand

Amazon.com has announced the launch of Amazon Gift Codes On Demand™ (AGC On Demand), a real-time electronic gift-card distribution option available from the Amazon Corporate Gift Card program. According to the company, “AGC On Demand is a simple Web service API that integrates Amazon’s proprietary gift-card technology directly into customer loyalty, employee incentive and payment disbursement platforms. With AGC On Demand, companies are able to reduce physical gift-card fulfillment overhead while providing gift card recipients with a customized experience and instant gratification.”

Previously, gift card values were fixed and management of inventory for active gift cards and gift codes purchased in bulk required secure facilities. With AGC On Demand, gift codes are created individually in virtually any denomination and can be immediately issued in almost any format — based on the client’s preference — including e-mail, HTML, customized/co-branded cards and paper receipts.
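To make the idea concrete, a client system calling such a web-service API might look roughly like the sketch below. The endpoint, parameter names and response fields here are hypothetical placeholders for illustration only, not Amazon’s actual AGC On Demand interface, which requires enrollment and its own authentication scheme.

```python
import requests  # third-party HTTP client library

# Hypothetical endpoint and credentials -- placeholders, not the real AGC On Demand API.
AGC_ENDPOINT = "https://example.com/agc-on-demand/create-gift-code"
API_KEY = "your-api-key"

def issue_gift_code(amount_usd, recipient_email, delivery_format="email"):
    """Request a single gift code in an arbitrary denomination and delivery format."""
    payload = {
        "amount": amount_usd,          # any denomination, e.g. 13.57
        "currency": "USD",
        "recipient": recipient_email,
        "format": delivery_format,     # e.g. "email", "html", "paper_receipt"
    }
    response = requests.post(
        AGC_ENDPOINT,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    # Hypothetical response shape: {"gift_code": "XXXX-XXXXXX-XXXX", "status": "issued"}
    return response.json()

# Example: reward a loyalty-program member instantly, with no physical fulfillment step.
# print(issue_gift_code(25.00, "member@example.com"))
```

The point of the pattern is that the gift code is created at the moment of the request, in the exact denomination needed, so there is no inventory of pre-purchased codes to secure and reconcile.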

“This is a great solution for developers and incentive companies who are looking for a more cost-effective way to manage a gift card program,” said Marcell King, senior manager of corporate gift cards with ACI Gift Cards, Inc. “The AGC On Demand service offers a quick and secure way to deliver gift cards and stored value to program participants.”

24 Aug 08

Managing Data in the clouds

Joe McKendrick has an interesting perspective on this topic:

More companies are emphasizing their ability to compete on analytics, and the ability to integrate and leverage enterprise data is key. Whether on-site or in the cloud, effective data integration is a must.

As cloud computing engagements increase in sophistication and edge ever closer to the mission-critical core of the enterprise, recognition is growing that there are enterprise data management issues that still need to be worked out. “Our belief is that cloud computing or on-demand computing is simply a way of further fragmenting data, because customers are absolving themselves from responsibility for the management, storage, security, and backup and recovery of the availability of that data,” Chris pointed out. However, he emphasized, “you must never, ever, absolve responsibility for the quality and the ownership of the data, and having such quality and ownership as part of your core business processes. And that requires integration.”

As Informatica’s Ron Papas put it, technically, there isn’t a lot of difference between on-site systems and data stores and cloud-managed systems and data stores. However, there’s a big difference in the ownership of these applications:

“What that’s doing is it’s bypassing the traditional process of having IT design the whole integration process into the solution. So, before you know it, you could be up and running with Salesforce.com without having put much thought into integration, because it’s really being led by the line-of-business side. You could have someone in the sales and marketing unit that somehow bypassed IT and went and implemented Salesforce. All of a sudden, they realize they need access to that data. They need it synchronized.”
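The integration gap Papas describes, a line-of-business cloud rollout that suddenly needs its data pulled back into internal systems, often gets patched with a small sync job along these lines. This is only a generic sketch; the endpoint and field names are invented stand-ins, not the actual Salesforce API.

```python
import sqlite3
import requests

# Hypothetical cloud CRM endpoint -- a stand-in, not the real Salesforce REST API.
CRM_URL = "https://example.com/crm/api/accounts"
API_TOKEN = "your-token"

def sync_accounts(db_path="local_warehouse.db"):
    """Pull account records from the cloud CRM and upsert them into a local table."""
    records = requests.get(
        CRM_URL, headers={"Authorization": f"Bearer {API_TOKEN}"}, timeout=30
    ).json()

    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS accounts
                    (id TEXT PRIMARY KEY, name TEXT, owner TEXT, updated_at TEXT)""")
    for rec in records:
        # Upsert keeps the local copy in step with the cloud system of record.
        conn.execute("""INSERT INTO accounts (id, name, owner, updated_at)
                        VALUES (?, ?, ?, ?)
                        ON CONFLICT(id) DO UPDATE SET
                            name=excluded.name, owner=excluded.owner,
                            updated_at=excluded.updated_at""",
                     (rec["id"], rec["name"], rec["owner"], rec["updated_at"]))
    conn.commit()
    conn.close()
```

Scripts like this keep the data flowing, but they also illustrate Chris’s warning above: the responsibility for quality and ownership of the data has not gone anywhere just because the application moved to the cloud.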

17 Aug 08

Love your data, set it free!

Data services are freeing corporate data from the silos, allowing for its use on demand while providing security to the data’s custodians. The demand for more data more quickly is driving IT departments to rethink their entire systems architectures.

At Cequity, we have been helping clients work within the constraints of multiple-source systems while making data accessible for marketing when they need it. Our philosophy has been to make data more flexible and easier to access, so that enterprises can take advantage of the huge amounts of data they accumulate today.

Dana Gardner writes:

In the past, data was structured, secure and tightly controlled. The bad news is that the data was limited by the firewall of personnel, technologies and process rigidity. Today, however, the demand is for just-in-time and inclusive data, moving away from a monolithic data system mentality to multiple sources of data that provide real-time inferences on consumers, activities, events, and transactions.

The move is in the ownership of data value to the very people who really need it, who help define its analysis, and who can best use it for business and consumption advantage. Analysis and productivity values rule the future of data as services. The [new] model is of keeping the data where it belongs and yet making it available to the rest of the world. Our data is trapped in these silos, where each department owns the data and there is a manual paper process to request a report.

According to Brad Svee: “Requesting a customer report takes a long time, and what we have been able to do is try to expose that data through Web services, using mashup-type UI (user interface) technology and data services, to keep the data in the place that it belongs, without having a flat file flying between FTP servers, as you talked about, and start to show people data that they haven’t seen before in an instant, consumable way.”
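A bare-bones version of what Svee describes, exposing a report as a web service instead of shipping flat files between FTP servers, might look like the sketch below. It assumes the Flask micro-framework and a hypothetical reporting database; it illustrates the pattern, not the specific system he is describing.

```python
from flask import Flask, jsonify
import sqlite3

app = Flask(__name__)

@app.route("/reports/customers/<region>")
def customer_report(region):
    """Serve the customer report on demand; the data never leaves its home database."""
    conn = sqlite3.connect("reporting.db")   # hypothetical reporting store
    conn.row_factory = sqlite3.Row
    rows = conn.execute(
        "SELECT customer_id, name, lifetime_value FROM customers WHERE region = ?",
        (region,)
    ).fetchall()
    conn.close()
    # A mashup-style UI can call this endpoint and render the result instantly,
    # instead of waiting for a flat file to be extracted and FTPed across.
    return jsonify([dict(r) for r in rows])

if __name__ == "__main__":
    app.run(port=5000)
```

The data stays in the silo that owns it; what gets freed is on-demand, governed access to it.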

Read more

19 Jul 08

Taming the data beast

In a world where the quantum of data being captured is growing by leaps and bounds, how do companies make sense of these huge piles of data? Data must be converted, at speed, into simple, easy-to-interpret visual forms.

In this article, Angela writes:

Moore’s Law has driven quantum leaps in the processing power of software and hardware systems. Organizations have become larger and more complex. Demands for up-to-the-minute access to data have intensified.

The most effective way to tame the data beast is through interactive visualization. Spreadsheets and tabular reports are at their limits. Utilizing visual metaphors allows multiple dimensions of the data to be understood at once. In context, it provides a “narrative” for the data. Interactivity allows the user to engage the data in his or her thinking process, which enables a dynamic dialogue with the data.

Empowering knowledge workers with visual tools and hands-on access to data lets them find patterns, distributions, correlations or anomalies across multiple data types. Users can select data elements, filters, highlighting and display options to change data perspectives – from high-level overviews down to the lowest levels of detail. The visual cues inherent in the software enable a deep exploration and understanding of the data set at hand.
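Even a few lines of scripting capture the core idea of filtering and re-plotting data from different perspectives. The sketch below assumes pandas and matplotlib and an invented sales dataset; purpose-built visualization tools add the live interactivity the article describes.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Invented sample data standing in for a real, much larger dataset.
df = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "West", "West"],
    "channel": ["online", "retail", "online", "retail", "online", "retail"],
    "sales":   [120, 90, 200, 150, 80, 60],
})

# High-level overview: total sales per region.
df.groupby("region")["sales"].sum().plot(kind="bar", title="Sales by region")
plt.show()

# "Drill down": filter to one region and switch the perspective to channel mix.
north = df[df["region"] == "North"]
north.groupby("channel")["sales"].sum().plot(kind="pie", title="North region by channel")
plt.show()
```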

Read more

19 Jul 08

The Value of the Consumer’s Voice

It’s not uncommon for clients to tell us that customers are increasingly not filling out application/enrollment and feedback forms completely, that data about their customers and their profiles is not being updated, and that information about their behaviour is increasingly scarce and difficult to get. How do companies handle a market environment like this?

We think organizations have to change the way they capture or mine data. They need to look at where else they can ‘dialog’ with these customers – Twitter, Facebook, Google Chat and the like. How often have you seen companies even capture such “Dialog Points” for a possible “information opportunity”?

There is an interesting point of view on this issue. Take a look:

To stay on top of purchase trends or address dissatisfactions, marketing teams often administer surveys and comb through hundreds of thousands of responses to find needles of profundity in the customer service haystack. In reality, if today’s consumer has a gripe or suggestion, he or she no longer fills out a comment card or a survey. Instead, the customer takes it to the keyboard and posts comments online for the whole world to read. Given the influence of word of mouth (or more appropriately, word of “blog”), your business will feel the direct effects of online reviews – positively or negatively.

According to a 2007 Deloitte & Touche study, more than eight in 10 (82 percent) of consumers said their purchasing decisions have been directly influenced by online reviews. With the advent of the blogosphere and social-media tools such as Twitter and Facebook, there are more opportunities for customers to voice their opinions, and the circle of influence has grown much wider. Monitoring for these comments must be a core part of your business.

Organizations have a unique opportunity to connect with their customers just by letting them know they are being heard. JetBlue Airways created a Twitter account after learning customers were using this platform to voice their frustrations. Now, Twitter has become a key customer service portal where the airline offers discounts and responds to flyers in real time. Using Twitter, JetBlue has turned around its negative public perception, even in a struggling aviation market.

The Web’s reach is boundless, and social-media text is largely unstructured, making it impossible for a typical search to uncover everything of relevance. So, how can marketers and executives sift through all of the social-media noise – the ruminations, misfindings and the insignificant rants – to find the true opinions and reviews about their brands?

The answer is semantic search and analysis. Semantic search provides early identification of consumer concerns, suggestions, likes and dislikes and purchasing trends. It uncovers this information from the most unstructured corners of the Web. The retrieval of such information is not limited to recognizing key words as typical Web searches do. Instead, it uncovers the meaning the words express in their proper context and accepted meaning no matter the number (singular or plural), gender (masculine or feminine), verb tense (past, present or future) or mode (indicative or imperative).
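The idea of matching on meaning rather than on literal keyword strings can be glimpsed even in a toy example: normalize surface variation (plural forms, verb tense) before comparing terms. The tiny lemma table below is invented purely for illustration; production semantic search relies on full linguistic resources and far richer contextual analysis.

```python
# Toy lemma table: maps surface forms to a canonical term. Real systems use
# full morphological dictionaries and context, not a hand-written mapping.
LEMMAS = {
    "delays": "delay", "delayed": "delay", "delaying": "delay",
    "cancellations": "cancellation", "cancelled": "cancellation",
    "refunds": "refund", "refunded": "refund",
}

def normalize(text):
    """Lowercase, strip punctuation, and map each token to its canonical form."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    return {LEMMAS.get(t, t) for t in tokens}

def matches(post, concept):
    """True if the post expresses the concept in any of its surface forms."""
    return LEMMAS.get(concept, concept) in normalize(post)

posts = [
    "My flight was delayed three hours again!",
    "Great legroom, friendly crew.",
    "Still waiting on refunds for the cancelled trip.",
]
for p in posts:
    print(matches(p, "delay"), matches(p, "refund"), "-", p)
```

A keyword search for “delay” would miss the first post entirely; normalizing tense and number is the first, smallest step toward retrieving by meaning.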

06 Jul 08

On-Demand Analytics – Changing the face of retail

Out-of-stocks, poorly timed inventory levels and other lost sales opportunities present a problem for retailers: they frustrate consumers and cost retailers and consumer product companies sales.

A growing group of companies is targeting this problem. By harnessing the store-level sales data that retailers make available to suppliers, and layering in analysis and other relevant information such as weather patterns, these companies can build a highly sophisticated picture of what happens at every step of a product’s journey to the consumer purchase.

The promise of demand data analytics – the catch-all phrase for such services – is that a better understanding of how, why and when people buy coupled with more knowledge of the supply chain and store-level execution will help stores stay stocked with the right stuff at the right time.

TrueDemand is one of the companies hoping to capitalize on the situation and has established a Bentonville office to better serve its growing Wal-Mart supplier clientele.

The company created software that takes Retail Link data, the sales data that Wal-Mart Stores Inc. provides to its suppliers, and analyzes it on a daily basis. Then, it creates company-specific, daily reports that predict out-of-stocks and prioritize actions at the store level to help prevent them.
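A stripped-down version of that kind of daily analysis, flagging store/item combinations that suddenly sell nothing despite a healthy recent run rate, could look like the pandas sketch below. The column names and threshold are invented for illustration; TrueDemand’s actual models are far more sophisticated than a zero-sales heuristic.

```python
import pandas as pd

def flag_likely_out_of_stocks(daily_sales, window=14, min_rate=1.0):
    """
    daily_sales: DataFrame with columns [store, item, date, units_sold], one row per day.
    Flags store/item pairs whose trailing average is healthy but whose latest day sold
    zero units -- a crude proxy for 'probably out of stock on the shelf'.
    """
    daily_sales = daily_sales.sort_values("date")
    flags = []
    for (store, item), grp in daily_sales.groupby(["store", "item"]):
        recent = grp.tail(window)
        trailing_avg = recent["units_sold"].iloc[:-1].mean()  # exclude the latest day
        latest = recent["units_sold"].iloc[-1]
        if trailing_avg >= min_rate and latest == 0:
            flags.append({"store": store, "item": item,
                          "trailing_avg": round(trailing_avg, 2)})
    # Rank by trailing rate so the biggest expected sellers come first.
    return (pd.DataFrame(flags, columns=["store", "item", "trailing_avg"])
              .sort_values("trailing_avg", ascending=False))
```

Ranking the flags by expected sales rate is what turns a raw exception list into the prioritized, store-level action list the reports describe.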

Shiloh Software has become a market leader in data-mining tools. It takes information from Retail Link and integrates data from a number of other syndicated data providers, such as NPD Group Inc. and ACNielsen, along with other information like weather forecasts, U.S. Census Bureau data and store-level traits, such as whether a store is near a university or in a largely Hispanic community.
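Blending point-of-sale data with outside signals is, conceptually, a set of joins keyed on store and date. The sketch below merges an invented sales extract with equally invented weather and store-attribute tables; it shows the shape of the integration, not Shiloh’s product.

```python
import pandas as pd

# Invented extracts standing in for Retail Link sales, weather feeds, and store attributes.
sales = pd.DataFrame({"store": [101, 101, 202],
                      "date": ["2008-07-01", "2008-07-02", "2008-07-01"],
                      "item": ["umbrella"] * 3,
                      "units_sold": [4, 19, 2]})
weather = pd.DataFrame({"store": [101, 101, 202],
                        "date": ["2008-07-01", "2008-07-02", "2008-07-01"],
                        "rainfall_mm": [0, 22, 1]})
store_traits = pd.DataFrame({"store": [101, 202],
                             "near_university": [True, False]})

# Join sales to weather by store and day, then add slow-changing store attributes.
enriched = (sales.merge(weather, on=["store", "date"], how="left")
                 .merge(store_traits, on="store", how="left"))
print(enriched)
```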

Read more

29 Jun 08

The Data Deluge Makes the Scientific Method Obsolete

Chris Anderson has written a great article in Wired on the data deluge and the new challenges it poses to companies. He writes that in the petabyte age we live in, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics. For companies that have or gather loads and loads of data, the implication is how quickly they can sift through these massive volumes; the successful ones will be those who can track and measure with unprecedented precision and scale. Take a look:

Speaking at the O’Reilly Emerging Technology Conference this past March, Peter Norvig, Google’s research director, offered an update to George Box’s maxim: “All models are wrong, and increasingly you can succeed without them.”

This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.

The big target here isn’t advertising, though. It’s science. The scientific method is built around testable hypotheses. These models, for the most part, are systems visualized in the minds of scientists. The models are then tested, and experiments confirm or falsify theoretical models of how the world works. This is the way science has worked for hundreds of years.

Scientists are trained to recognize that correlation is not causation, that no conclusions should be drawn simply on the basis of correlation between X and Y (it could just be a coincidence). Instead, you must understand the underlying mechanisms that connect the two. Once you have a model, you can connect the data sets with confidence. Data without a model is just noise.

But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the “beautiful story” phase of a discipline starved of data) is that we don’t know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.

Read more




At Cequity, we believe customer intelligence will be the biggest competitive advantage enterprises will have in the next decade or two. The successful enterprises of tomorrow will be the ones who can organize and leverage this information at speed to optimize their marketing performance, increase accountability, improve profit and deliver growth. Cequity Insights will bring you trends and insights in this area; it’s our way of sharing best practices to help you accelerate this culture and thinking in your organization.
