Updates from August, 2015

  • admin 9:47 am on August 28, 2015 Permalink
    Tags: JapanCentralized

    Japan-Centralized Management of Enterprise Data Hub and Integrated Data Warehouse 


    Teradata White Papers

     
  • admin 9:55 am on August 27, 2015 Permalink

    Teradata 2015 Partners Virtual Conference 


    Teradata Videos

     
  • admin 9:54 am on August 27, 2015 Permalink
    Tags: Pathway

    Run Before You Walk – Only Pathway to Embrace Analytics 

    A large Australian Government department was recently looking to set up its internal analytics capability. I was invited to present at their kick-off meeting, where the IT team proposed first bedding down a reporting infrastructure and then moving into complex data analysis – the phrase used was “we need to walk before we can run”. Based on my experience in successfully setting up analytics in many organisations, I suggested a different approach.

    This approach takes into account the highly unstructured nature of analytics projects and the possibility that not all analytics ideas will be successful. It enables quick testing of analytics opportunities, with the ability to move rapidly into production if needed. The approach speedily searches the data landscape to conceive, test and deploy analytics ideas, and supports the business in fact-based decision-making. Important enablers of this approach, which may be implemented through business processes or software, include the following.

    1. Access to operational data: In many organisations, access to data is limited to IT teams, with data scientists restricted to accessing cubes and reports. This regime slows momentum, forcing data scientists to waste time foraging for data and building duplicate data marts in Excel, and leaving them no means to operationalise ideas that result in good business outcomes. Direct read access to operational data for data scientists will remove these blockers.
    2. Space and tools to prototype: The business value of operational data increases when it is linked with other, external data. For instance, my retail analytics team correlated sales with census demographics using a regression model to develop a prototype for identifying new store locations (a sketch of this kind of model follows this list). The space and modelling tools enabled the development of the prototype as well as identification of the data feeds needed for the production application.
    3. Support for rapid deployment: Once an analytics prototype demonstrates ongoing business value, it is important to leverage the prototype to move the analysis into production without a long and protracted IT engagement. This means:
      1. Using the prototype as the specification rather than having to write the requirements again;
      2. Restricting data feeds to those identified by the prototype; and
      3. Continuing to publish the analytics insights to business while the application is being developed and using that engagement to define the user interface.
    4. Strong governance: Given that not all analytics projects will be successful, a strong governance process is needed to ensure that databases created by abandoned projects do not hang around indefinitely, and that production applications do not access any of the prototype databases.
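
    To make the prototyping enabler concrete, below is a minimal sketch of the kind of regression prototype described in point 2. It is illustrative only: the input files, column names and model choice are assumptions for this example, not the actual model my team built.

    ```python
    # Minimal sketch of a store-location regression prototype (illustrative only).
    # Assumes two hypothetical extracts: annual sales per existing store, and
    # census demographics per catchment area. File and column names are made up.
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    stores = pd.read_csv("store_sales.csv")          # store_id, area_id, annual_sales
    census = pd.read_csv("census_demographics.csv")  # area_id, population, median_income

    # Link operational sales data with external census data on catchment area.
    df = stores.merge(census, on="area_id")

    features = ["population", "median_income"]
    model = LinearRegression().fit(df[features], df["annual_sales"])

    # Score candidate locations: predict sales for areas with no store yet.
    candidates = census[~census["area_id"].isin(stores["area_id"])]
    ranked = candidates.assign(
        predicted_sales=model.predict(candidates[features])
    ).sort_values("predicted_sales", ascending=False)
    print(ranked.head(10))  # top ten candidate areas by predicted sales
    ```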

    Using these enablers, analytics teams can easily build momentum with the following.

    1. Analytics project pipeline: Data scientists need a pipeline of projects spanning multiple stakeholders and business functions, to ensure that analytics exposure is as broad as possible and to mitigate the risk of some opportunities failing or not being adopted by the business.
    2. Early wins: When I embarked on setting up analytics capabilities at a manufacturing company, I was given three months to show the value of advanced analytics. This constraint provided the impetus to speedily identify and prototype an analytics opportunity. The insights from this prototype funded both the production deployment and further investment in analytics. Analytics ideas need to move from conception to prototype in 6–12 weeks, with unprofitable ideas abandoned much earlier than that.

    And that brings me to the title. No baby ever learnt to walk by taking slow, deliberate steps. Similarly, no organisation is likely to succeed in setting up a brand new analytics capability by labouring over a couple of years to deliver an analytics infrastructure – except in the unlikely scenario of a patient management team willing to keep investing in unproven potential, an unchanging market landscape and competitors at a standstill! Even then, what is delivered at the end may not be what was originally envisaged, and the organisation gives up on analytics as a failure.

    So, my advice to organisations embarking on the analytics journey is: rather than walking first with business intelligence and reporting and then attempting to run with analytics on a reporting infrastructure, run first by exploring multiple analytics opportunities to yield quick business value, and then walk by deploying the reporting of those analytics to get ongoing value from the insights. This “run before you walk” approach will set up the organisation for data-driven decision-making using both advanced analytics techniques and regular business intelligence reporting.

    Bhavani Raskutti is the Domain Lead for Advanced Analytics at Teradata ANZ. She is responsible for identifying and developing analytics opportunities using Teradata Aster and Teradata’s analytics partner solutions. She is internationally recognised as a data mining thought leader and is regularly invited to present at international conferences on mining big data. She is passionate about transforming businesses to make better decisions using their data capital.

    The post Run Before You Walk – Only Pathway to Embrace Analytics appeared first on International Blog.

    Teradata Blogs Feed

     
  • admin 9:54 am on August 27, 2015 Permalink

    A Strategic Approach to Data and Analytics 


    Teradata White Papers

     
  • admin 9:54 am on August 27, 2015 Permalink

    3 CTOs Discuss the Shift to Next Gen Analytic Ecosystems 


    Teradata Web Casts

     
  • admin 9:45 am on August 27, 2015 Permalink
    Tags: JapanAccelerate

    Japan-Accelerate The Value of Big Data Analysis and Discovery 


    Teradata Brochures

     
  • admin 9:57 am on August 26, 2015 Permalink

    How Governments Can Engage Big Data Tech Tools for the Public Good 

    The effectiveness and legitimacy of government today is best demonstrated by its ability to deliver public services and accurate information to citizens when they need them. This is truer than ever, because we live in an age of unlimited information, when data anywhere can be captured, analyzed and transformed into personal insight in hours or minutes – with visual clarity. Today’s new technologies open doors and opportunities for government to better serve the public good.

    The social contract, which is the foundation of government, imposes specific responsibilities toward citizens, such as protecting lives, maintaining liberties, securing property rights and providing access to information. Many of us wonder what more governments can do to hold true to the social contract.

    Billions of dollars are spent each year by our federal and state governments on technology and services – yet many citizens still cannot access or receive the public services and information they need. This is inexcusable – at a time when thousands of businesses across the world have been effectively using database tools and analytics to personally engage individual customers with relevance every day.

    The good news is that, while some government agencies acknowledge their challenges and seek solutions, a number of agencies are moving forward, fighting through the massive web of politics to improve the delivery of information and services to citizens. We can name these state agencies, because they are using Teradata analytics and data integration services to recover mega-millions for their respective economies. At the same time, these agencies are better serving their citizens. The point is – this can be done by government agencies that are on the leading edge.

    Now that more powerful tools and services to manage data and provide analytical insight have become commonplace, government agencies can acquire them, deploy them quickly, and use them to better meet their obligations to the public. With an integrated database, they can see important connections that they have missed in the past, link unique identities of citizens with entitlements, and discover opportunities to operate more efficiently while supporting and benefiting citizens.
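
    As a simple illustration of what linking identities to entitlements can look like in practice – a hypothetical sketch, not any agency’s actual system; the tables, columns and eligibility rule are invented – consider the following:

    ```python
    # Hypothetical sketch: link citizen identities to entitlement enrollments
    # to surface citizens who are eligible but not enrolled. All data invented.
    import pandas as pd

    citizens = pd.DataFrame({
        "citizen_id": [1, 2, 3, 4],
        "age":        [67, 34, 71, 29],
    })
    enrollments = pd.DataFrame({
        "citizen_id": [1],
        "program":    ["senior_benefit"],
    })

    # Assume citizens aged 65+ are eligible for a senior benefit program.
    eligible = citizens[citizens["age"] >= 65]

    # Left-join on the unique identifier; a missing program marks a service gap.
    gaps = eligible.merge(enrollments, on="citizen_id", how="left")
    print(gaps[gaps["program"].isna()][["citizen_id", "age"]])
    ```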

    Today’s database engines can help establish a direct, accountable connection between citizens and the state. Governments are already developing data-rich repositories based on unique citizen identifiers, or biometrics, which open a wide range of possibilities for improving service delivery.

    So, we are seeing steps taken by government agencies, but they are baby steps – at a time when commercial businesses continue to take steady leaps and realize enormous benefits.

    By using database engines to link social services with citizens, governments can cut through the political red tape and optimize public resources for the benefit of citizens.

    Data-driven, evidence-based policy

    The recent introduction of big data tools changes prospects for the better. Governments can meet the requirements of citizens – and increasingly, with real-time information. Policymakers can now leverage real-time data to help fulfill their responsibilities – and honor the social contract.

    In 2009, the discovery and spread of the new H1N1 virus (a combination of bird and swine flu) posed an immediate health risk. The U.S. Centers for Disease Control and Prevention (CDC) first relied on doctors to manually tabulate and report cases to determine where the vaccine was needed most. Unfortunately, the resulting one- to two-week delay was simply too long for an infected patient to wait.

    Enter a Google software program: it identified 45 key search terms that, when combined in a mathematical model, showed a strong correlation between predicted flu spread and official figures nationwide. This allowed the CDC to report an accurate status in real time, not weeks after the fact. This approach provided a method for the data to speak; and when it did, it made a difference when it truly mattered.
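
    The underlying technique can be illustrated with a toy nowcasting model – a simplified stand-in for the Google program, with made-up terms and figures – that regresses reported case counts against search-term frequencies:

    ```python
    # Toy nowcast of case counts from search-term frequencies (illustrative
    # only; a simplified stand-in for the model above, with invented data).
    import numpy as np

    # Weekly frequencies of three flu-related search terms (rows = weeks).
    searches = np.array([
        [120,  40, 15],
        [180,  65, 30],
        [260,  90, 55],
        [340, 130, 80],
    ])
    reported_cases = np.array([1000, 1600, 2500, 3400])  # official weekly counts

    # Fit a linear model by least squares: cases ~ w . searches + b.
    X = np.column_stack([searches, np.ones(len(searches))])
    coeffs, *_ = np.linalg.lstsq(X, reported_cases, rcond=None)

    # Nowcast this week's cases from today's search volumes, without
    # waiting one to two weeks for manually tabulated doctor reports.
    this_week = np.array([300, 110, 70, 1.0])
    print(f"estimated cases this week: {this_week @ coeffs:.0f}")
    ```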

    Conversely, data that is improperly managed or ignored can have devastating results. In the case of Pakistan, 1.3 million doses of vaccines donated by the United Nations Children’s Fund (UNICEF) – costing US$3.7 million – were wasted by health authorities because they didn’t know where they were needed most, or how to properly preserve them. In a country where one in ten children does not survive their fifth birthday, neglecting the use of big data proved costly.

    Then, there’s the state of Michigan, which has achieved financial benefits of $1 million per business day, mostly in the Medicaid/Health and Human Services area. How? The renowned success of Michigan’s Data Warehouse is about far more than the number of people who use it and the amount of data that is shared – those facts are a means to an end. The true measure of its success is this: no other state in America has achieved such concrete and impressive business results from its innovative use of a Teradata database engine to improve outcomes, reduce costs, streamline operations, and manage programs. Nor has any other state been so ambitious in its attempts to solve as many real-life problems through the innovative sharing and comprehensive analysis of data.

    To meet the complex, growing demands of modern citizenship, governments can no longer afford to ignore the potential of big data. Technology offers not only the means to collect real-time data, but also the power to process huge amounts of it, fast. And choosing between the census, official statistics and big data is no longer necessary – all three can be used to confirm or reject inferences, and help ensure responsive, well-informed policymaking. The result can mean real, qualitative improvements in governance – and a measurable boost in citizens’ quality of life.


    TARIQ MALIK is a Senior Industry Consultant (Government Systems) at Teradata, and former chairman of the National Database & Registration Authority (NADRA) in Pakistan. Before joining Teradata, Malik spearheaded one of the world’s largest multi-biometric system rollouts, resulting in the registration of more than 100 million citizens. A visionary IT leader, Tariq has used the Teradata enterprise data warehouse for development: empowering women, strengthening democracy, reforming governance, widening the tax net and reducing poverty in Pakistan.

    The post How Governments Can Engage Big Data Tech Tools for the Public Good appeared first on Industry Experts.

    Teradata Blogs Feed

     
  • admin 9:47 am on August 26, 2015 Permalink

    In Memory Processing for High Performance Analytics 


    Teradata Web Casts

     
  • admin 10:34 am on August 25, 2015 Permalink

    2015 IM Symposium 

    Teradata will be at the 2015 Information Management Symposium in Amelia Island, Florida, September 20–23. We hope you’ll join us at Booth #9! You’ll be welcomed by industry-trained Teradata healthcare consultants ready to answer your questions and show you how to unlock the potential of big data in healthcare. Teradata empowers Blue Cross Blue Shield organizations to realize the value of data and analytics to achieve high-quality, low-cost care coordination; precision consumer engagement; integrated finance performance management; and improved population health. Stop by Booth #9 to learn more.
    Teradata Events

     
  • admin 9:54 am on August 25, 2015 Permalink
    Tags: Soul

    Consumer Demand – The Heart and Soul of Retail 

    Over the last few months, as I have transitioned into primarily supporting retail customers, I have been updating my knowledge of current trends in customer analytics, personalisation and data-driven marketing. Not surprisingly, this led me to the topics of omni-channel and the connected consumer (which will no doubt give me plenty of ideas for future blogs). An area of strength for Teradata retail analytics is Demand Chain Management (DCM), and as DCM solutions are not new, I reached out to one of my colleagues, Nick Scott (Principal Retail Industry Consultant), to get his perspective on the role DCM plays for today’s retail organisations:

    “Retailers are constantly challenged by their customers. The evolution from multi-channel to omni-channel has meant a mind-shift in how best to manage consumer demand. Digital shopping tells us so much about the customer and how they want to engage. What often gets missed is capturing this data to determine paths to purchase, customer sentiment, and the method of purchase chosen by the customer.

    Ultimately, all the customer wants is a retailer that has their products available from any channel, at any time, and at the right price. Retailers need the ability to capture these shifts in demand and adaptively forecast that demand to ensure stock is replenished and/or allocated to the right location for quick customer turn-around. This can be complex unless a retailer has the ability to crunch millions of rows of demand data to effectively model demand, tune forecasts, and overlay promotional programs to derive time-phased orders. Teradata’s Demand Chain Management (DCM) solution allows retailers to achieve demand forecast centricity, drive service levels and be core to all retail support functions.”
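
    To ground the idea of adaptive forecasting, here is a minimal sketch of one common building block – simple exponential smoothing with a promotional uplift overlay. It is a concept illustration only, with invented weekly demand figures and parameters, and is not a description of how Teradata’s DCM solution works internally.

    ```python
    # Concept sketch of adaptive demand forecasting (illustrative only;
    # weekly unit sales, alpha and the promotional uplift are invented).

    def smoothed_forecast(history, alpha=0.3):
        """Simple exponential smoothing: each new actual nudges the running
        forecast, so the estimate adapts as demand shifts."""
        forecast = history[0]
        for actual in history[1:]:
            forecast = alpha * actual + (1 - alpha) * forecast
        return forecast

    weekly_units = [420, 455, 430, 510, 540, 525]  # one SKU at one location
    base = smoothed_forecast(weekly_units)

    promo_uplift = 1.25  # assume a planned promotion lifts demand by 25%
    print(f"baseline forecast: {base:.0f} units/week")
    print(f"promotional-week forecast: {base * promo_uplift:.0f} units/week")
    ```

    A production system would, of course, estimate uplift from historical promotions and forecast at SKU and location level across millions of rows, as the quote above describes.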

    The EKN Industry Point of View paper The Powerful Potential of Well-Tuned Demand Chain Management, which is available for download, reinforces the fact that a proactively managed demand chain is a must-have in the era of omni-channel retail: “The retail C-suite’s most strategic business goals remain focused on driving profitable business growth through deeper customer engagement and continued business efficiency improvements. However, the starting point of deeper customer engagement is in developing a comprehensive understanding of consumer demand and what this means to the business. It is also critical to decipher the patterns emerging therein to manage service levels, inventory sell-through and turns.”

    The report goes on to highlight the need to sync marketing and the supply chain, providing one view of customers and products: “deeper customer profiling analysis, demand forecasting and inventory/supply chain preparedness at the SKU and location level are some of the critical steps for creating one view of the consumer and inventory that helps with top-line and bottom-line attainment”. This continues to be a key topic in customer discussions.

     

    One of the benefits of cross-analysing customers and products is an improved ability to forecast consumer demand.

    “The ability to analyse omni-channel demand data and the resultant forecasting, ordering, replenishment and fulfillment has to evolve with the changing nature of customer transactions, where the demand signal and fulfillment may take place in different locations. This can be addressed in two ways. First, as consumers become more omni-channel, tech-savvy, value-conscious and aware, it is a requirement for retailers to track consumer demand data when the initial signal takes place within the channel or at store level. Second, fulfillment-location demand must also be considered during specific order, replenishment or fulfillment scenarios across channels.”

    What I find interesting in learning about DCM capabilities is how quickly they bridge back to customer analytics. Perhaps it is not only about omni-channel capabilities, but rather about enabling an omni-view of the organisation, encompassing both customers and products across any channel.

    Monica Woolmer has over 25 years of IT experience and has been leading data management and data analysis implementations for over 15 years. As an Industry Consultant, Monica’s role is to utilise her diverse experience across industries to understand clients’ businesses, articulate industry vision and trends, and identify opportunities to leverage analytics. Monica has a cross-industry focus and is currently primarily assisting Retail and Public Sector clients across Australia and New Zealand. Connect with Monica via LinkedIn.

    The post Consumer Demand – The Heart and Soul of Retail appeared first on International Blog.

    Teradata Blogs Feed

     