Updates from June, 2015

  • admin 9:51 am on June 25, 2015 Permalink
    Tags: Return, ShowStopper

    The Return on Analytics: What is the Show-Stopper? 

    A recent Wikibon report found that enterprises are struggling to derive maximum value from Big Data. While they expect a return of around $3.50 on the dollar, their return to date is just $0.55. So what stops the business from deriving value from analytics? Based on my 25 years of experience in the Australian analytics industry, I believe the single biggest show-stopper is the skepticism of analytics consumers about the value of analytics, and the consequent lack of endorsement for embedding analytics insights in their decision processes.

    Let me illustrate with two different analytics projects that I have been involved in, one for a telecommunication company and the other for a wholesale clothing manufacturer.

    Both projects were decision support systems generating upsell leads for consultants selling into a declining corporate market. Each telco consultant looked after a number of corporates within their region, while each wholesale consultant was responsible for selling more products to a single large retailer. In both cases, the consultants’ bonuses were tied to sales and hence there was a compelling business case to use analytics to uplift sales.

    Both projects had access to historical data needed for analysis and could process it successfully. The telco provider had the historical quarterly revenue summary for each customer, while the wholesaler had the weekly sales and stock for each stock item (SKU) at each store (Weekly Point of Sales data).

    Both projects successfully created predictive analytics solutions that provided demonstrable uplift over the existing methods, despite requiring significant innovations to overcome hurdles (see the KDD 2005 Proceedings). Hence, both received management support and funding for deployment into regular use.


    Both deployments had an analytic back-end that generated predictive models from the most recent data, and a front-end that presented an opportunity list sorted by the size of the opportunity. For the telco, the back-end needed a manual start each quarter and the results were presented via Excel spreadsheets. The wholesale solution was fully automated, with the back-end building models from weekly data feeds and a self-serve front-end directly accessible by the sales team. Both systems also explained how the opportunities were derived; e.g., Figure 1 shows the sell rate for two similar products P1 and P2, and the fact that despite similar demand P2 is underperforming.
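As an illustration of the front-end's core logic only (the account names, figures and scoring rule below are hypothetical, not taken from either project), ranking accounts by the size of the upsell opportunity might look like:

```python
# Sketch of an opportunity list like the ones both front-ends presented:
# score each account, then sort by opportunity size, largest first.
# All field names and numbers here are illustrative only.

def predicted_uplift(account):
    """Hypothetical model output: expected extra revenue if the lead is followed up."""
    return account["expected_revenue"] - account["current_revenue"]

def opportunity_list(accounts):
    """Return (account, uplift) pairs sorted by the size of the opportunity."""
    ranked = sorted(accounts, key=predicted_uplift, reverse=True)
    return [(a["name"], predicted_uplift(a)) for a in ranked]

accounts = [
    {"name": "Corp A", "current_revenue": 120_000, "expected_revenue": 150_000},
    {"name": "Corp B", "current_revenue": 80_000,  "expected_revenue": 85_000},
    {"name": "Corp C", "current_revenue": 200_000, "expected_revenue": 260_000},
]

for name, uplift in opportunity_list(accounts):
    print(f"{name}: ${uplift:,.0f} opportunity")
```

Whether the list is refreshed by a manual quarterly run (telco) or an automated weekly feed (wholesale), the consultant-facing output is the same: a ranked list of leads worth following up.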

    The telco application has been in use for more than 10 years despite the lack of full automation, and continues to provide leads for the sales consultants to follow up each quarter. The use of the automated wholesale application, on the other hand, has been patchy. The few consultants who use it attribute more than 10% of their sales to the opportunities identified by analytics, yet others continue to ignore those opportunities.

    The main difference is how the system has been embraced by the consumers of the analytics insights. The telco sales consultants believe the predictions, find them useful and continue to train new staff so the use of analytics is embedded in their sales process. In the wholesale business, there is skepticism about analytics among a large group of the users. Hence, despite the analytics system still generating great opportunities from the automated data feeds, it continues to be ignored by them. Only early adopters use the system and get the sales uplift.

    These two analytics projects are not isolated examples. Operational analytics insights, whether deployed manually or automatically, generate substantial business value only when their consumers believe in them and adopt their use. And while C-suite support may be necessary to get analytics off the ground, it is the endorsement of the business users that sustains business value from an analytics investment.

    Bhavani Raskutti is the Domain Lead for Advanced Analytics at Teradata ANZ. She is responsible for identifying and developing analytics opportunities using Teradata Aster and Teradata’s analytics partner solutions. She is internationally recognised as a data mining thought leader and is regularly invited to present at international conferences on mining Big Data. She is passionate about transforming businesses to make better decisions using their data capital.

    The post The Return on Analytics: What is the Show-Stopper? appeared first on International Blog.

    Teradata Blogs Feed

  • admin 9:47 am on June 25, 2015 Permalink
    Tags: He, Room, Said

    No Room For ‘He Said, She Said’ In Cyber Security 

    Teradata Articles

  • admin 9:46 am on June 25, 2015 Permalink

    Transform Finance: Leverage Analytics and Technology

    Teradata Web Casts

  • admin 9:51 am on June 24, 2015 Permalink

    Pace Yourself When Solving Problems 

    I was a competitive distance runner in high school and college and continue to compete in local races. Like any sport, running requires a strategy. And part of that strategy is pacing—if you go all out the first mile, you’ll hit the proverbial “runner’s wall” a few miles later, run out of energy and won’t have a successful race. Instead, you want to establish a pace early on that you can sustain for the entire distance.

    My experience as a runner confirms what Teradata 14 Certified Master Carrie Ballinger says about pacing yourself—you need to set a tempo at the beginning of a race and continue it until the end. Ballinger also applies the concept to solving technical challenges. If you pull an all-nighter today, your mind won’t be as sharp tomorrow, and that’s when urgent requests might require your full attention and total focus.

    To read Ballinger’s Q&A on topics ranging from Teradata® Active System Management statements to block sizes after a database migration, statistics histograms and other technical challenges, see her column in the Q2 2015 issue of Teradata Magazine.

    Brett Martin
    Teradata Magazine


    The post Pace Yourself When Solving Problems appeared first on Magazine Blog.


  • admin 9:49 am on June 24, 2015 Permalink

    How You Think About Big Data For Cyber Security And What You’re Doing About It May Not Agree 

    Teradata Articles

  • admin 9:48 am on June 24, 2015 Permalink
    Tags: General, Specialty

    Teradata Rapid Insights for Retail: General Merchandise and Specialty 

    Teradata Brochures

  • admin 9:54 am on June 23, 2015 Permalink
    Tags: InStore, PICKUP

    Bringing Omni-Channel IN-STORE PICKUP To Life 

    Omni-Channel Priorities

    Retailers today are investing in new and different ways to deliver on the promise of the right product, at the right time, in the right channel, for the right customer. Top retail delivery priorities include Ship-From-Store, In-Store Pickup and In-Store Associate Ordering, and each is evolving at a different pace across the industry. Last month, I hosted the E-Tail East Summit in Atlanta, where we explored these three topics.

    The next strategy for retailers to employ as part of their omni-channel consumer choice and convenience engagement model is In-Store Pickup. With this fulfillment approach, the consumer gets the option of ordering online and picking up the products in-store for convenience, for immediate gratification and to avoid shipping fees (although most retailers offer free shipping options).

    For the retailer, the consumer is in the store, where they may purchase additional items. The retailer also gets the opportunity to engage the consumer in a positive, differentiated shopping environment and to encourage return visits face-to-face (vs. email, etc.).

    In a recent Forrester report, in-store pickup (ISP) was highlighted: “For some retailers, like Target, in-store pickup accounts for 10% of online sales. Store pickup is a key capability that retailers must embrace if they are to compete with online pure plays. 47% of consumers cited that they use store pickup to avoid online shipping costs, 25% use store pickup so they can collect their orders on the day they purchase them (thus avoiding the wait for shipping). From a retailer perspective, 52% of retailers cited inventory accuracy issues as a major barrier to the roll-out of these programs. With 25% of consumers using pickup as a means to obtain their purchase on the same day, it is perhaps no surprise that 41% of consumers expect to be notified that their order has been picked and is ready for collection in under an hour (18% expect their items to be ready in under 20 minutes).”

    Inventory accuracy is crucial to provide a good consumer experience.


    There are a number of key areas to consider when planning an in-store pickup strategy.

    Inventory Accuracy

    • Safety stock levels
    • Price discrepancies between online and in-store prices
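As a minimal sketch of the safety-stock point above (the function and the numbers are illustrative assumptions, not from any retailer's system), a store might offer in-store pickup for an order only when fulfilling it still leaves the safety stock intact:

```python
# Illustrative in-store pickup eligibility check: the order is offered for
# pickup only if on-hand inventory minus the requested quantity stays at or
# above the store's safety stock for that SKU. Thresholds are made up.

def can_offer_pickup(on_hand, safety_stock, qty_requested):
    """True if fulfilling the order keeps inventory at or above safety stock."""
    return on_hand - qty_requested >= safety_stock

print(can_offer_pickup(on_hand=12, safety_stock=5, qty_requested=3))  # True
print(can_offer_pickup(on_hand=6, safety_stock=5, qty_requested=3))   # False
```

A check like this only works if the on-hand figure is trustworthy, which is why inventory accuracy is the barrier retailers cite most.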

    Customer Pick-up Location

    • Customer ease
    • 37% of shoppers purchase additional items when picking up in-store

    Store Ops / Training

    • Established protocols and training
    • Associate goals and metrics
    • Ensure pick and pickup process doesn’t interfere with in-store customers



    Consumer Benefits

    • Convenience of online shopping
    • Same-day pickup
    • No shipping costs

    Retailer Benefits

    • 37% purchase additional products while picking up in-store
    • Leverage store assets
    • Increase foot traffic and upsell opportunity
    • Demonstrate a unique and differentiated guest experience

    The post Bringing Omni-Channel IN-STORE PICKUP To Life appeared first on Industry Experts.


  • admin 9:49 am on June 23, 2015 Permalink
    Tags: Siren, Song

    Is Your Big Data Safe? Beware The Siren’s Data Song 

    Teradata Articles

  • admin 9:48 am on June 23, 2015 Permalink
    Tags: Drug

    Teradata Rapid Insights for Retail: Food, Drug and Mass 

    Teradata Brochures

  • admin 9:51 am on June 22, 2015 Permalink

    Don’t Let Your Data Lake Turn into a Swamp 

    by Rick Stellwagen and Paul Barsch

    Data lakes can enable smarter business decisions. However, the simple installation of a Hadoop cluster does not constitute a data lake. In fact, without following best practices, the investment will result in wasted time and money. 

    Manage from Start to Finish

    Imagine a single place in the enterprise where both transactional and multi-structured data types (Web logs, sensor data and other machine-to-machine communications) can be captured, stored and accessed. Also, in this same place, data can be profiled and reviewed before extensive modeling efforts occur so business insights can be gained more quickly.

    This hub of enterprise activity—where all data types can reside and be accessed for fast discovery—is called a data lake. It is primarily built on Hadoop because the technology can scale effectively with volume, support high data velocity and ingest all data types.

    Think Big, a Teradata company, has defined, implemented and managed data lakes for dozens of organizations while codifying best practices. By taking advantage of the practices outlined here, businesses can start their data lake initiatives effectively without spending time on rework and sorting through clutter.

    Think Big, Start Smart

    Developing a strategy and architecture is important to ensure big data success. The strategy does not need to take several months or years to complete. In fact, in as little as six weeks, Think Big can help a company identify and prioritize use cases, define an initial architecture, understand organizational readiness and decide which initiatives to launch first.

    And when it’s impractical to gather all the data owners to work on a strategy in a timely manner, Think Big offers a fixed-price starter service. This approach rapidly builds out two or three well-governed data streams to show business value quickly—without sacrificing a big data architecture and roadmap.

    This enables the creation of effective data management processes, ensuring that information propagated into the data lake is high quality and traceable. These processes can then serve as a foundation for discussions with other data stewards to promote expansion of the solution.

    Integrate Data Management Practices

    The well-known adage “garbage in, garbage out” also holds true for data lakes—some of which have turned into dumping grounds for corporate data. The idea that a tangled web of data can be useful for analysis without first considering and enforcing acceptable data management practices is a costly misconception. To provide tangible business value, careful attention must be given to how data is consumed, moved, tagged, transformed, managed, accessed and secured.

    Processes for data quality and metadata management must be established prior to bringing in the information. The journey to better quality starts by tracing both data and metadata from their sources. One example is to go beyond schema to include operational and business security metadata to enable proper governance.

    Metadata capture should be a continual process for good governance as the organization deploys profiling, masking, modeling and archiving techniques in its data lake. All points for access should be tracked and traced including how, when and who will use the information.
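To make the track-and-trace idea concrete, here is a minimal sketch of an append-only access log (the record fields and class names are illustrative assumptions, not a Think Big or Teradata schema) that records who touched a data set, when and how:

```python
# Illustrative metadata/audit log for data-lake governance: every access or
# transformation of a data set is recorded with who performed it, when and how.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessRecord:
    dataset: str    # data set touched, e.g. "weekly_pos_sales"
    user: str       # who accessed it (person or job)
    operation: str  # how: "profile", "mask", "model", "archive", ...
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class MetadataLog:
    """Append-only log supporting the track-and-trace practice described above."""

    def __init__(self):
        self.records = []

    def record(self, dataset, user, operation):
        self.records.append(AccessRecord(dataset, user, operation))

    def lineage(self, dataset):
        """All recorded operations on a data set, oldest first."""
        return [r for r in self.records if r.dataset == dataset]

log = MetadataLog()
log.record("weekly_pos_sales", "analyst1", "profile")
log.record("weekly_pos_sales", "etl_job", "mask")
print([(r.user, r.operation) for r in log.lineage("weekly_pos_sales")])
```

The point is not the mechanism but the discipline: capture continues as profiling, masking, modeling and archiving are deployed, so every data set's history remains traceable.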

    Strive for Measurable Business Value

    For decades, the lion’s share of IT budgets has typically been dedicated to monitoring and maintaining technology systems rather than promoting innovation. Even though IT budgets are projected to increase slightly in the coming years, there is still not a lot of spend available for experimentation with new technologies.

    With this in mind, it’s extremely important for IT to set goals for showing measurable business value. For Hadoop-based data lake projects, follow these best practices:

    • Secure business sponsorship. Obtain a named leader and funding from one or more business units.
    • Commit to clear objectives. Work with the business to establish metrics and expectations not only for the project’s performance but also for what each group will do to achieve success.
    • Report regularly. Deliver results to sponsors and management every 60 to 90 days, or at another pre-arranged interval.

    With the business committing to sponsorship and clear objectives, and IT providing timely results and reporting, the project will be seen as vital to both and is more likely to stay on time and on track.

    Maintain Momentum

    There are many reasons why a data lake can fail. If it is not implemented correctly with data management, metadata capture, governance, security and a business focus, it could turn into a data swamp that costs millions and that no one uses.

    An initiative that may cost several million dollars should not be left to chance. By following the best practices of thinking big but starting smart, incorporating data management practices continuously and regularly reporting on business value, your data lake will continually fill, expand and benefit your company for years to come. 

    This article originally appeared in the Q2 2015 issue of Teradata Magazine.

    Rick Stellwagen is the Data Lake Program Director for Think Big, a Teradata Company. Paul Barsch directs marketing programs for Think Big.



    The post Don’t Let Your Data Lake Turn into a Swamp appeared first on Magazine Blog.

