Tagged: Ecosystem

  • admin 9:51 am on February 8, 2017 Permalink
    Tags: Britain, Ecosystem, Prosper, Serving

    Lloyds Banking Group: One Ecosystem Serving Multiple Brands Delivering Business Outcomes to Help Britain Prosper 

    Latest imported feed items on Analytics Matters

     
  • admin 9:48 am on January 10, 2017 Permalink
    Tags: Ecosystem, Shrewd

    Building the Right Analytical Ecosystem Architecture Takes Shrewd Planning 


    Teradata Articles

     
  • admin 9:51 am on September 17, 2016 Permalink
    Tags: Ecosystem

    Teradata’s Borderless Analytics Turns Hybrid Clouds into a Single Analytic Ecosystem 

    Provides seamless shifting of analytic workloads across a multi-system hybrid cloud environment
    Teradata United States

     
  • admin 9:52 am on December 18, 2015 Permalink
    Tags: Diverse, Ecosystem

    Teradata Offers Unmatched High Performance Database and Diverse Ecosystem of Alliance Partner Solutions 

    Large, diverse ecosystem of alliance partners makes it faster and easier to deploy and get value from Teradata technology
    Teradata News Releases

     
  • admin 9:55 am on October 23, 2015 Permalink
    Tags: Ecosystem, simplify

    Teradata Unifies Technologies to Accelerate Performance and Simplify Deployment of Analytic Ecosystem 

    Teradata significantly enhances performance and ease of use when executing analytics that span the Teradata® Unified Data Architecture™
    Teradata News Releases

     
  • admin 9:54 am on August 30, 2015 Permalink
    Tags: Ecosystem, Pluralism, Secularity

    Pluralism and Secularity In a Big Data Ecosystem 

    Solutions around today’s analytic ecosystem are too technically driven, with too little focus on business value. The buzzwords seem to overcompensate for the realities of implementation and cost of ownership. I challenge you to view your analytic architecture using pluralism and secularity. Without such a view of this world, your resume will fill out nicely, but your business value will suffer.

    In my previous role, prior to joining Teradata, I was given the task of trying to move “all” of our organization’s BI data to Hadoop. I will share my approach – how best-in-class solutions come naturally when pluralism and secularity are used to support a business-first environment.

    Big data has exposed some great insights into what we can, should, and need to do with our data. However, this space is filled with radical opinions and the pressure to “draw a line in the sand” between time-proven methodologies and what we know as “big data.” Some may view these spaces as moving in opposite directions; however, they will collide. The question is not “if” but “when.” What are we doing now to prepare for this inevitability? Hadapt seems to be moving in the right direction in terms of leadership between the two spaces.

    Relational Databases
    I found many of the data sets in relational databases to be lacking in structure, highly transient, and loosely coupled. Data scientists needed to have quick access to data sets to perform their hypothesis testing.

    Continuously requesting that IT rerun their ETL processes was highly inefficient. A data scientist once asked me, “Why can’t we just dump the data in a Linux mount for exploration?” Schema-on-write was too restrictive because the data scientists could not predefine the attributes of a data set before ingestion. As the data sets became more complex and unstructured, the ETL processes became exponentially more complicated and performance was hindered.
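
    The schema-on-read alternative the data scientists were asking for can be sketched roughly as follows. This is a minimal illustration, not the original project’s code: it assumes PySpark is available, and the path and field names are hypothetical.

    # Schema-on-read sketch: read the raw files as landed, infer the
    # structure at query time, and explore without a predefined ETL schema.
    # (Path and field names below are hypothetical.)
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("exploration").getOrCreate()

    # Raw export dumped on a shared mount/HDFS path, read directly.
    raw = spark.read.json("/mnt/landing/clickstream/*.json")

    # Inspect the inferred schema instead of declaring it up front.
    raw.printSchema()

    # Hypothesis testing against whatever attributes turned up.
    raw.filter(raw.event_type == "purchase").groupBy("channel").count().show()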

    I also found during this exercise that my traditional BI analysts struggled to formulate questions about the data. One reason was that the business did not know what questions to ask. This is a common challenge in the big data ecosystem. We are used to knowing our data and being able to come up with incredible questions about it. The BI analyst’s world has been disrupted: as Ilya Katsov notes in one of his blog posts, they now need to ask, “What insights/answers do I have about my data?”

    Hadoop/NoSQL
    The product owner of Hadoop was convinced that the entire dataset should be hosted on Amazon Web Services (S3), which would allow our analytics (via Elastic MapReduce) to perform at incredible speeds. However, due to various ISO guidelines, the data sets had to be encrypted at rest and in transit, which degraded performance by approximately 30 percent.
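
    For illustration only, here is a hedged sketch of what those two encryption requirements look like when landing files on S3 with the boto3 client; the bucket and key names are hypothetical, and the post does not describe the project’s actual tooling.

    # Encryption in transit (TLS) and at rest (server-side encryption)
    # when uploading a data set to S3. Bucket/key names are hypothetical.
    import boto3

    s3 = boto3.client("s3", use_ssl=True)  # TLS for data in transit

    with open("clickstream_extract.csv", "rb") as f:
        s3.put_object(
            Bucket="analytics-landing-zone",
            Key="raw/clickstream_extract.csv",
            Body=f,
            ServerSideEncryption="AES256",  # SSE-S3 encryption at rest
        )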

    Without an access path model, logical model, or unified model, business users and data scientists were left with little appetite for unified analytics. Data scientists were left to their own guidelines for integrated/federated/governed/liberated post-discovery analytical sets.

    Communication with the rest of the organization became an unattainable goal. The models that came out of discovery were not federated across the organization because there was a disconnect between the data scientists, data architects, Hadoop engineers, and data stewards, who all spoke different languages. Data scientists were creating amazing predictive models while, at the same time, data stewards were looking for tools to help them provide predictive insight into the SAME DATA.

    Using NoSQL to answer a specific question on a dataset required a new collection set. Maintaining and governing the numerous collections became a burden. There had to be a better way to answer many questions without the number of instantiated collections growing linearly with them. The answer may lie in access path modeling.
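
    A rough illustration of that linear growth, assuming a MongoDB-style document store accessed through pymongo (the post does not name the actual product, and the collection and field names below are hypothetical):

    # Each new question tends to spawn its own purpose-built collection,
    # so collections (and their governance burden) grow with the questions.
    from pymongo import ASCENDING, MongoClient

    db = MongoClient("mongodb://localhost:27017")["analytics"]

    # Question 1: "orders by customer" -> a collection keyed by customer_id.
    db.orders_by_customer.create_index([("customer_id", ASCENDING)])

    # Question 2: "orders by product per day" -> another materialized
    # collection built from the same source data, with its own key.
    db.orders_by_product_day.create_index(
        [("product_id", ASCENDING), ("order_date", ASCENDING)]
    )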

    Another challenge I faced was when users wanted a graphical representation of the data and the embedded relationships or lack thereof. Are they asking for a data model? The users would immediately say no, since they read in a blog somewhere that data modeling is not required using NoSQL technology.

    At the end of this entire implementation I found myself needing to integrate these various platforms for the sake of providing a business-first solution. Maybe the line in the sand isn’t a business-first approach? Those who drive Pluralism (a condition or system in which two or more states, groups, principles, sources of authority, etc., coexist) and Secularity (not being devoted to a specific technology or data ‘religion’) within their analytic ecosystem can truly deliver a business-first solution approach while avoiding the proverbial “silver bullet” architecture solutions.

    In my coming post, I will share some of the practices for access path modeling within Big Data and how it supports pluralism and secularity within a business-first analytic ecosystem.

    Sunile Manjee

    Sunile Manjee is a Product Manager in Teradata’s Architecture and Modeling Solutions team. Big Data solutions are his specialty, along with the architecture to support a unified data vision. He has over 12 years of IT experience as a Big Data architect, DW architect, application architect, IT team lead, and 3gl/4gl programmer.

    The post Pluralism and Secularity In a Big Data Ecosystem appeared first on Data Points.

    Teradata Blogs Feed

     
  • admin 9:48 am on August 12, 2015 Permalink
    Tags: Bloor, Ecosystem

    The Bloor Group: Teradata and the Hadoop Ecosystem 


    Teradata White Papers

     
  • admin 9:56 am on May 23, 2015 Permalink
    Tags: Ecosystem

    DISRUPTING THE CPG MARKETING SERVICES ECOSYSTEM 

    Consumer goods marketers employ a cadre of contractors, agencies and consulting firms. That’s not going to change rapidly in the next 1-2 years, but what will change is the role these parties play with the Consumer Goods manufacturer. Something must change. According to a recent Deloitte survey of 4,047 respondents encompassing 28 product categories and more than 350 brands, brand loyalty is declining. The Deloitte study cited in this Inc. article describes price-sensitive consumers still reeling from recession as the norm, challenging any attempts to develop brand affinity. The solution? “Brand segmentation,” which requires consumer goods companies to “Rethink their product portfolio in light of the widening gap between the affluent and lower-income households. Consumer products companies may need to have distinct strategies (e.g., brands, product offering, pricing) to target affluent and lower-income consumers.”

    That sounds like a data and analytic problem. How prepared are CPG brand marketers to attack it?

    The Consumer Data Asset

    Data about your consumers should be your data – not something outsourced and scattered among contractors or agencies. This data and its insights could be an asset that drives better decision-making across the organization; if it were treated that way, it would be accounted for on your company’s balance sheet. None of this means you necessarily need to own the technology that collects, analyzes and puts consumer data to use. It does imply that traditional marketing services companies purporting to offer this capability simply cannot deliver it.

    Marketing service provider business models are based on the premise that their clients care little about the underlying data and marketing technologies working behind the scenes. Instead, they emphasize servicing virtually any outsourceable marketing need. Diversity to this extent makes for thinness in certain areas – and this is becoming more and more apparent as business demands rapidly shift away from traditional marketing techniques. It is leading many to retreat from the strategy of using “one” agency for all needs and to shift back to preferring best-of-breed providers for specific business needs.

    Marketing services providers are largely ill-equipped to handle the realities of a data-driven world – one characterized by big data, the proliferation of consumer channels and rapid technology innovation. The pace of change is so fast that only firms putting their full prowess behind adapting are capable of delivering the most differentiated capabilities.

    To fill the agency gap, CPG business leaders are investing in and partnering with new entrants in the technology space – especially in the areas of creating new social networks, mobile applications and other ways of engaging consumers. In a data-driven world, the quality and effectiveness of marketing technology isn’t an inconvenience to be delegated to a services company; it’s a mission-critical competency that generates insight with immense value.

    Picking the Right Partner

    In October 2014, McKinsey’s Global Co-Lead of Digital, David Edelman, posted an article to LinkedIn titled, “Time for Marketers and Agencies to Shake it Up.” His piece describes the need for agencies to change the nature of their business models to focus on higher value services. He also says:
    “Who do you need as a partner? When every different channel has its own specialist agencies claiming expertise in it, a client can get overwhelmed by a mix of as many as 14 different traditional, media, digital, social, mobile, sponsorship, etc. agencies. If each corporate business unit and geography gets to choose their own agency, the complexities are overwhelming and negate much opportunity for scale or cross-channel coordination.”

    Agencies and marketing services companies are not prepared to address these issues – it’s simply not their traditional business. Replacing redundant and wasteful database and technology development efforts across the contractor ecosystem with standards helps agencies deliver the most impactful and measurable creative. Standards also yield consistent and comparable metrics across brands and campaigns – an absolute necessity at a time when marketers are under the microscope (“The heightened focus on marketers and their related costs will spur marketers to better use analytics”).

    Bottom Line: The digital marketing business model employed by most Consumer Products companies is neither flexible enough nor suited to enable data-driven business priorities. Outsourcing is actually less efficient in a world where latency between insights and actions is not something struggling CPG brands can afford. Use agencies and marketing service providers for what they are “best at” (e.g., content, creative, web, mobile, programmatic) and clearly define your internal data-driven marketing strategy, including who owns the data, where consumer data should be stored and maintained, and how insights and analytics are defined.

    The post DISRUPTING THE CPG MARKETING SERVICES ECOSYSTEM appeared first on Industry Experts.

    Teradata Blogs Feed

     
  • admin 9:47 am on May 1, 2015 Permalink
    Tags: Ecosystem

    Teradata rolls out platform updates designed to connect Hadoop ecosystem 

    Teradata Press Mentions

     
  • admin 9:54 am on April 19, 2015 Permalink
    Tags: 2Step, Ecosystem, Solid

    2-Step Solution for a Solid Data Ecosystem 

    by Brian Richards and David R. Schiller, CCP

    From the outside looking in, data integration (DI) seems easy, right? You just combine data residing in different systems to give users a unified view of the individual data elements in a timely manner. All you need is a database, a few SQL statements, some scripts and voila, perfection! However, delivering DI across data—including big data—and information systems is anything but simple.

    DI runtimes can account for 70% to 80% of the overall data ecosystem workload. And as the need for business analytics grows, system resources that were previously available for data integration are now being required for these analytics requests. Organizations then have to look at tuning, redesigning and expanding the DI function within their analytics ecosystem.

    Teradata® Consulting Services offers Data Integration Optimization Services that can assist with the evaluation of all options for improving DI, including deciding whether to modify extract, transform and load (ETL) code, re-architect ETL processes or extend the ecosystem with options such as Apache™ Hadoop® or a Teradata system.

    Offload or Optimize
    As new technologies, business requirements, data sources and other factors are introduced, a rebalancing or data integration optimization (DIO) effort is often required. Rebalancing entails changing the architectural model and offloading non-analytic processes like ETL to another platform to free the data warehouse to concentrate on analytics.

    However, offloading ETL processes is not easy since many dependencies need to be considered and weighed first. Plus, most organizations have written their ETL processes over a period of time, and even if they applied the best practices of the day, they are still yesterday’s approaches. Those processes must therefore be evaluated to find ways to lower costs now while keeping an eye toward the future.

    Another approach is to optimize the DI in place. Yet optimizing existing implementations can be challenging, since doing so involves weighing:

    • Resources
    • Spending money to “fix” a solution that has already been purchased
    • Using existing tools
    • Re-implementation expenses
    • New business drivers as a result of changing priorities
    • Balance of potential gains versus expenses

    With such a wide variety of trade-offs available, Teradata Consulting Services recommends assessing the DI environment and then systematically tuning, redesigning and expanding or extending the ecosystem.

    2-Step Solution
    An organization’s DI practices are often not the best, and the data loading processes and environment may actually have eroded over time. Although a company often assumes that offloading is the answer, Teradata experts have found, based on real-world implementations, that re-architecting DI processes recaptures valuable CPU cycles and returns them to the data warehouse for business analytics use. Teradata Consulting Services offers a two-step approach for DIO:

    Step 1: DIO Assessment
    Critical components of an assessment are a goal and performance metrics. Thanks to query banding technology from Teradata, it is easy to determine how specific DI functionalities relate to utilization on the Teradata Database. (An example of a Teradata query banding implementation with Informatica Connections can be found on the Teradata Developer Exchange.)
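
    As a rough, hedged sketch of how DI sessions get tagged for that kind of attribution (using the Teradata SQL Driver for Python; the host, credentials, band values and target tables below are hypothetical, and the article itself pairs query banding with Informatica rather than hand-written Python):

    # Tag a DI session with a query band so its work can be attributed
    # in DBQL later. Connection details and names are hypothetical.
    import teradatasql

    with teradatasql.connect(host="tdprod", user="etl_svc", password="changeme") as con:
        with con.cursor() as cur:
            # Everything this session runs is now labeled as DI/ETL work.
            cur.execute(
                "SET QUERY_BAND = 'ApplicationName=ETL;JobName=nightly_load;' FOR SESSION"
            )
            # Subsequent DI statements inherit the band.
            cur.execute("INSERT INTO stg.orders SELECT * FROM landing.orders")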

    Defining a measurable goal for the optimization, for instance recovering 50% of the CPU from the analytic database for user availability, is critical to success. The metrics, then, should include costs, which can guide the value proposition of the recommendations.
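
    One way to check progress against such a goal, sketched under the assumption that DI work carries the query band set in the previous sketch and that DBQL query logging is enabled (the DBQL view and column names are standard, but treat the details, and the connection parameters, as assumptions):

    # Rough sketch: what share of logged CPU is consumed by queries
    # carrying the DI query band? Assumes DBQL logging is enabled.
    import teradatasql

    SQL = """
    SELECT
        CASE WHEN QueryBand LIKE '%ApplicationName=ETL%' THEN 'DI' ELSE 'Other' END AS Workload,
        SUM(AMPCPUTime) AS TotalCPU
    FROM DBC.QryLogV
    GROUP BY 1
    """

    with teradatasql.connect(host="tdprod", user="dba_ro", password="changeme") as con:
        with con.cursor() as cur:
            cur.execute(SQL)
            for workload, cpu in cur.fetchall():
                print(workload, cpu)  # compare the DI share against the 50% target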

    With metrics and a goal, Teradata Consulting Services can provide a recommended approach. Based on the timeline, risk, cost, roadmaps, service level agreements (SLAs), critical execution windows and other factors, the services specialists will usually recommend a combination of options to meet specific objectives as the project increases in complexity and cost. Typically, tuning is on the low end of the spectrum and extending the ecosystem is at the top.

    Step 2: Engineer the Solution
    Armed with Teradata Consulting Services recommendations, organizations can begin the implementation, testing and deployment. Throughout this process, baseline metrics from the DI optimization assessment should be measured against the established goals.

    Once the DIO is complete, organizations can expect to see improved SLAs and CPU cycle recovery. The DI strategy and architecture, once determined, can establish consistent patterns to enable DI delivery that increases productivity and decreases costs.

    Leverage Proven Techniques
    The answer to benefitting from new solutions is as clear today as it has been across many generations of new capabilities and platforms: use the available platforms for their strengths, limit complexity and focus on the delivery of value—not just on trying to use the latest and most hyped technology.

    That’s why the Teradata approach to DI is to assess, plan and execute an engineered solution using proven techniques from leaders in the field. Through experience, expertise and thought leadership, Teradata DIO experts leverage multiple vendors across various domains to provide a balanced solution for a solid ecosystem.

    Read this article and more in the Q1 2015 issue of Teradata Magazine.

    Brian Richards is a partner with Teradata Professional Services. He has more than 25 years of IT experience and leads the Enterprise Data Management Center of Experience (COE) within Teradata.

    David R. Schiller, CCP, has nearly 30 years of IT experience. He manages Teradata Professional Services marketing programs.

     

    The post 2-Step Solution for a Solid Data Ecosystem appeared first on Magazine Blog.

    Teradata Blogs Feed

     