Archive for September, 2018

The Rise of Consumer-Centric Marketing: Made Possible Through Data Science

September 24th, 2018

Traditional business models — those based on mass advertising and comparatively impersonal marketing — are rapidly being turned on their heads as brands instead focus on forming individual connections with consumers. These brands are modern, nimble and naturally tech-savvy, and they bring a digital, data-driven business model to the table, aiming to become the driving force behind a new, consumer-centric economy.
Indeed, the rise and influence of direct-to-consumer (DTC) relationships is worth noting: big brands that fail to effectively engage their consumers risk becoming obsolete. But in the omnichannel era, how do you reach consumers in a way that’s optimized across each and every channel?
Given the rise of mobile-first economies, marketers in any industry will tell you the answer is data. Data by itself, however, is useless unless made actionable, and brands like Marks & Spencer have found one solution: training employees to become well-versed in the language of analytics and data science. Improving data literacy throughout your organization enables marketers to map consumer preferences, concerns and buying habits at scale.
Modern marketing in a digital-first economy takes a multifaceted approach, but at its core, it is driven by data-backed decisions. The source of this data also varies widely given the rapid evolution of the consumer journey and the number of touch points along the way.
Having helped craft data-driven marketing strategies for more than a decade, I’ve seen firsthand that the marriage between marketing and data science is more important than ever for wrangling and consolidating data across all touch points and for drawing correlations in consumer trends.
Rethinking Consumer Journey Mapping
Consumer-centric campaigns by their very nature require different marketing approaches. And engaging modern consumers is, in any case, a mix of science and art: the creativity of marketing combined with the analytical power of data science. Marketers want to understand how to measure the consumer journey and find levers, such as personalized marketing, to influence the path. The problem is that the consumer journey is complex, and typical media mix models involve long delays in analyzing consumer behaviour. Real-time actionability is a challenge, as models that don’t get retrained with new signals lose efficacy. This is where data science comes in.
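To make that concrete, the usual fix is incremental retraining: fold each fresh batch of signals into the model as it arrives rather than refitting on a long media-mix cycle. Below is a minimal sketch using scikit-learn’s SGDClassifier; the features, batch shapes and conversion label are hypothetical illustrations, not any particular vendor’s model.

```python
# A minimal sketch of incremental retraining (scikit-learn >= 1.1):
# each new batch of consumer signals updates the model in place, so it
# never goes stale waiting for a full media-mix refresh. Features and
# labels here are hypothetical illustrations.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss")  # logistic regression trained by SGD

def on_new_signals(X_batch: np.ndarray, y_batch: np.ndarray) -> None:
    """Fold a fresh batch of (touch-point features, converted?) into the model."""
    model.partial_fit(X_batch, y_batch, classes=np.array([0, 1]))

# Simulate ten incoming batches: each row is one consumer touch point
# (say recency, frequency and a channel score); y marks conversion.
rng = np.random.default_rng(0)
for _ in range(10):
    X = rng.normal(size=(100, 3))
    y = (X[:, 0] + rng.normal(size=100) > 0).astype(int)
    on_new_signals(X, y)

# Score a new consumer with the continuously refreshed model.
print(model.predict_proba(rng.normal(size=(1, 3))))
```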
Data orchestration to create actionable insights and applications is the newest piece of the puzzle and will come to be the norm. Increasingly, we are also seeing identity services offered in programmatic, such as through The Trade Desk and Sizmek. Identity data can feed consumer journey mapping while the data science team helps execute the planned strategy, enabling marketers to personalize messaging and improve consumer engagement in real time, all while drawing on marketing’s creative strengths to engage consumers effectively.
Merging marketing with data science ensures real-time signals are captured to allow for rapid testing so models can be optimized accordingly. The process, in turn, facilitates a better understanding of the consumer journey across all channels as well as a data-driven methodology to improve engagement.
One of the challenges in many organizations is that data scientists still spend 80% of their time accessing data, and only 20% of their time on data analysis and collaboration. With the right tooling, organizations can reduce manual tasks and optimize workflows.
Overall, it’s not just about the platform or tools; it begins with the process. To get started, bring the data science and marketing teams together for rapid testing and learning. This requires three things: tooling to bridge data and workflow between data and creative teams; the infrastructure to run tests and analyze them rapidly (and, ideally, automatically); and, finally, a user base big enough that your tests are appropriately powered, even over a short period of time.
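That last point is worth quantifying. A rough sketch of the standard two-proportion sample-size calculation shows why a large user base matters for short tests; the baseline conversion rate and the lift being tested are hypothetical numbers.

```python
# Back-of-envelope sample size for a two-variant test: users needed per
# arm to detect a lift from a 2.0% to a 2.4% conversion rate at 5%
# significance and 80% power. The rates are hypothetical.
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_arm(p1: float, p2: float,
                        alpha: float = 0.05, power: float = 0.80) -> int:
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) / (p2 - p1)) ** 2
    return ceil(n)

print(sample_size_per_arm(0.020, 0.024))  # ~21,000 users per arm
```

At a 2% baseline, even a healthy 20% relative lift takes on the order of 21,000 users per variant to detect reliably, which is why thin traffic leaves short tests underpowered.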
Direct-To-Consumer: Wave Of The Future?
DTC brands have set the precedent and are thriving, but marketers within any industry — from start-ups to Fortune 500 organizations — will reap the benefits of establishing a “data economy.” Thinking from the perspective of a data scientist and taking a data-first approach will also allow brand marketers to detect anomalies in their overall marketing strategies, anticipate hurdles and adjust accordingly.
During the initial rapid test phase, be curious and explore different options, paths and possibilities. With the right tools and strategies, continuous improvement becomes the norm, enabling marketers to approach the challenge from multiple angles and to test and weed out what does and doesn’t work in real time.
With two-thirds of all U.S. consumers expecting direct connectivity to brands, capitalizing on a consumer-first approach presents a significant opportunity for the modern marketer to integrate data science. Brands can ultimately achieve greater results by optimizing their traditional marketing stack, not only leveraging the power of data but also understanding its full effects and potential implications.
Source: The above opinions are a personal perspective based on information provided by Forbes and contributor Forbes Agency Council.
https://www.forbes.com/sites/forbesagencycouncil/2018/09/20/the-rise-of-consumer-centric-marketing-made-possible-through-data-science/#63f96e23729d

SAP SuccessFactors Gets Human over Digital HR

September 17th, 2018

Automation technologies are the current darlings of the tech zone. Driven by Machine Learning (ML)-powered Artificial Intelligence (AI) and big data analytics, the logic runs: if a process, task or service can be defined, mapped and compartmentalized as a discrete component, then in theory it can be automated by software.

So if we can digitize every aspect of business and life, then why not digitize us humans too? Okay, perhaps not our living tissue as such (although full-blown implanted bio-intelligence will surely be next), but our human needs in the workplace… that thing we call Human Resources (HR) or Human Capital Management (HCM).

Digitized HR conundrum

But as we automate and digitize into HR, which way does the balance shift — that is, does digital Human Resources force firms to become more digital, or does it in fact give them the opportunity to become more human?

German data software company SAP bought SuccessFactors at the back end of 2011 and has kept the company name to describe its HR applications and tools division. With new forays into Customer Relationship Management (CRM) and a history steeped in Enterprise Resource Planning (ERP) software, bringing the ‘personnel’ software factor into its stack at the start of this decade was a logical enough thing to do.

Given the opportunity to now digitally capture and empower many more staff actions in the workplace, SAP’s wider play is one that sees employee information channelled into a total proposition that it likes to brand as the Intelligent Enterprise (CAPS deliberate). The firm insists that it can make digital HR a more human-focused thing and it has recently expanded its SAP SuccessFactors software toolset with that specific strategic aim in mind.

Digital HR help for humans

The latest product developments from the SAP SuccessFactors camp now see the firm offering a new digital HR assistant. Currently in beta (pre-launch) form with a number of test-case customers, this is software designed to guide and recommend worker actions based on verbal and/or written questions or commands. SAP makes much of the Machine Learning (ML) element in its SAP Leonardo ‘design thinking’ brand, and ML is highlighted here as the key function that allows the software to ‘learn’, as it goes along, what kinds of HR requests a user might make.

This new digital assistant is built using the SAP Co-pilot bot framework and SAP Leonardo machine learning to create a conversational experience. It is also integrated with collaboration platforms including Slack and Microsoft Teams.

According to Andrea Waisgluss, user experience content strategist for SAP SE, users can chat, ask questions and give commands to these chatbots just as they would with a regular person. “The user’s informal and unstructured speech [is] then contextualized, analyzed and used to execute actions and present the user with business objects, options and other relevant data in a simple and conversational way,” she said.
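SAP hasn’t published the internals here, but the pattern Waisgluss describes is straightforward to sketch: classify the intent behind an unstructured utterance, then route it to a concrete HR action. The toy below is purely illustrative; the intents, keywords and handlers are hypothetical and not SAP’s API, and a real assistant would use an ML intent classifier rather than keyword matching.

```python
# A toy sketch of the chatbot pattern described above: unstructured text
# is mapped to an intent, then dispatched to an HR action. All intents,
# keywords and handlers are hypothetical, not SAP CoPilot internals.
from typing import Callable

def request_leave(text: str) -> str:
    return "Opening a leave request for you..."

def show_payslip(text: str) -> str:
    return "Here is your most recent payslip."

INTENTS: dict[str, tuple[list[str], Callable[[str], str]]] = {
    "leave":   (["vacation", "leave", "time off"], request_leave),
    "payslip": (["payslip", "salary", "pay stub"], show_payslip),
}

def handle_utterance(text: str) -> str:
    lowered = text.lower()
    for keywords, handler in INTENTS.values():
        if any(k in lowered for k in keywords):
            return handler(text)  # execute the matching HR action
    return "Sorry, I didn't catch that. Could you rephrase?"

print(handle_utterance("I'd like to book some time off next week"))
```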

Talent metrics – a means to measure humans

In terms of the background intelligence driving this new app and directing the machine learning engine in the new digital assistant, SAP SuccessFactors global head of marketing Kirsten Allegri Williams has explained that SAP has a catalogue of more than 2,000 ‘talent metrics’ (along with guidance on how to interpret the information) as the basis for accelerating the analysis of workforce and business issues.

“[These metrics include data focused on areas including] hire and hiring, learning, mobility through the organization and ‘span of control’, demographics and diversity, absence, performance, payroll, compensation, career paths, leadership and succession, through to retention and turnover… and ultimately, metrics related to business outcomes in terms of growth, revenue and profitability,” said Allegri Williams.

SAP CEO Bill McDermott has said that back when he was a teenager running a corner deli store, his CRM system used to be the front window pane and his HCM system was a hug [from a happy customer]. McDermott has also said that it’s really important that enterprises do not run their businesses based on the dissemination of emails to stipulate adherence to Key Performance Indicators (KPIs).

“Somehow we have to make these big companies feel like small companies again,” said McDermott.

Of course, there is another conundrum here. SAP makes the bulk of its money selling business analytics software that helps customers track what’s happening in their operational models down to a fine degree, which sits somewhat awkwardly alongside the corner-deli sentiment. We can perhaps safely assume that McDermott would suggest we work with a reasonable mix of both humanity and digitization.

Ultimately, digital HR might actually be a necessity. SAP SuccessFactors president Greg Tomb has noted that as much as 44% of company workforce spend today is channelled towards external workforce elements. Tomb also notes that the workforce is no longer a narrowly defined group of people. For most organizations, the workforce is a diverse, globally dispersed, mobile collection of individuals who are often disengaged from the enterprises they work for.

As part of extended product news, SAP has announced the creation of a new ‘open community’ intended to create purpose-built HR applications. The company hopes that small start-ups and larger established enterprises will come together to ‘co-create’ what could be large-scale applications or smaller ‘micro-apps’ (pieces of software with more limited, specific functions). The new community is organised around apps that fall into six initial pillars: well-being; pay equity; real-time feedback; unbiased recruiting; predictive performance; and internal mobility.

“We believe this wave of innovation will result in a ‘human revolution’ that will allow businesses to focus time, talent and energy on the thing that really matters: the people that lead to business outcomes. With this community, we can help assemble a complementary set of solutions for our customers’ diverse needs. And, if they don’t exist yet, we can co-create them together,” said Tomb.

Digital HR humanity

So is there a real difference between old school HR and new age digital HCM – and, back to our original question, does digital Human Resources negatively force firms to be more digital, or in fact allow them to become more human?

The answer lies in the fact that digital HR ‘should’ make companies more human if it is embraced and implemented correctly, in a holistically connected way with multiple channels of access. Applied carefully, digital HR can help us identify bias and inequality in the workplace and also help us focus on human well-being, because we’ll know more about what people are actually doing in the roles they are assigned to.

Humans are obviously an integral part of so-called digital transformation on the road to cloud, web-scale business and ubiquitous connectivity. Let’s just hope we can keep the human factor uppermost as we go forward.

Source: The above opinions are a personal perspective based on information provided by Forbes and contributor Adrian Bridgwater.

https://www.forbes.com/sites/adrianbridgwater/2018/09/13/sap-successfactors-gets-human-over-digital-hr/#764aa9ed5fb3

Can Big Data Alone Keep Up With Ad Tech?

September 10th, 2018

Ad tech is unique, with its own distinct requirements and constraints. Digital advertising is increasingly transacted programmatically, which demands technology that can not only accommodate extreme data volumes but also process the data at the pace of real-time digital business. The question is, can ‘big data’ alone suffice for all the needs of the ad tech industry?

The Holy Grail of digital advertising is to reach the right consumer, with the right message, at the right time and in the right place. Keeping track of return on investment of marketing budgets with the right attribution is also very important. The challenge is to identify the right technology to mine the data and efficiently process it into a sell-able asset; it is the refining process that makes the raw data valuable.

At PubMatic, we understand that value lies in the quality of the data refinement. We are committed to providing high-quality reporting and analytics to empower our clients to leverage data at every stage of a campaign to inform programmatic activity and make smarter, faster business decisions. In building our platform, we defined three areas where we had to perform:

  1. Volume of Data
  2. Instant Decisioning
  3. Manageable Cost

Volume of Data

Successful customer engagement in the ad tech space demands lightning-fast queries on high volumes of complex data. We must be able to accommodate larger data sets and deliver more complex deals and services to our largest clients. The deployment must be flexible enough to provide cost-effective and easy-to-consume services. We want to give our clients the ability to translate massive volumes of complex data into digital insight at unparalleled speed, with streaming data analysis and streamlined machine learning. To do this, we augmented our Big Data platform with a new class of technology focused on accelerated parallel computing. With Kinetica, a Graphics Processing Unit (GPU)-powered database that contributes high-speed data processing capabilities, PubMatic can empower our customers with real-time reporting and a sophisticated ad pacing engine.
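PubMatic doesn’t detail the pacing engine itself, but the core idea behind ad pacing is easy to sketch: throttle a campaign’s auction participation so its budget is spent evenly across the day rather than exhausted in the first hours. A minimal, hypothetical illustration:

```python
# A toy sketch of budget pacing: participate in an auction only when
# actual spend is tracking (or trailing) the ideal even-spend line.
# The campaign figures are hypothetical illustrations.
import random

class Pacer:
    def __init__(self, daily_budget: float, seconds_in_day: int = 86_400):
        self.budget = daily_budget
        self.seconds = seconds_in_day
        self.spent = 0.0

    def should_bid(self, elapsed_seconds: int) -> bool:
        """Bid only if we are at or behind the even-spend schedule."""
        target = self.budget * elapsed_seconds / self.seconds
        if self.spent >= self.budget:
            return False                  # budget exhausted
        if self.spent <= target:
            return True                   # behind schedule: bid
        return random.random() < 0.1      # ahead of schedule: mostly sit out

    def record_win(self, price: float) -> None:
        self.spent += price

pacer = Pacer(daily_budget=1_000.0)
print(pacer.should_bid(elapsed_seconds=3_600))  # one hour into the day
```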

Instant Decisioning

Advertising is the lifeblood of the internet, and digital advertising is increasingly transacted online programmatically, with eMarketer estimating that over 80% of digital display ads will be bought programmatically this year. Programmatic buying and selling of advertising uses real-time bidding to match marketers, who are trying to reach consumers across desktop, mobile and over-the-top devices, with publishers and media companies that attract people with content. Digital advertising demand-side platforms (DSPs), sell-side platforms (SSPs), centralized data management platforms (DMPs) and exchanges are dealing with a fire hose of real-time data that needs quick analysis to make advertising tick. At PubMatic, we needed to be able to sweep through vast volumes of complex streaming data in milliseconds in order to create, target and deliver ads with incredible speed and our signature precision. Technology-wise, we rely on the speed and parallel-processing power of Kinetica’s GPU engine to get the job done. Artificial intelligence powered by GPUs can optimize auctioning by discovering patterns and uncovering hidden insights in sub-second time. By running ad decisioning algorithms on this engine, it’s easier for us to target the right audience and display the ads likeliest to appeal to them.
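For context, most exchanges at the time cleared trades with a second-price auction: the highest bid wins the impression but pays the runner-up’s price, subject to the publisher’s floor. A minimal sketch with hypothetical bid values:

```python
# A toy second-price auction, the clearing mechanism most exchanges used
# circa 2018: the highest bid wins but pays the second-highest price
# (or the floor, if higher). All bid values are hypothetical.
def run_auction(bids: dict[str, float], floor: float = 0.0):
    eligible = {b: p for b, p in bids.items() if p >= floor}
    if not eligible:
        return None  # no bid met the publisher's floor
    ranked = sorted(eligible.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    clearing = ranked[1][1] if len(ranked) > 1 else floor
    return winner, max(clearing, floor)

print(run_auction({"dsp_a": 2.40, "dsp_b": 1.90, "dsp_c": 0.75}, floor=1.00))
# -> ('dsp_a', 1.9): dsp_a wins the impression but pays dsp_b's price
```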

Manageable Cost

Programmatic trading operates at significant scale, with PubMatic generating over 400 terabytes of uncompressed data each day and processing over 10 trillion advertiser bids per month. However, the value of each individual transaction is relatively low compared with other industries. Therefore, the cost per transaction must be lower than in many other industries, which means the infrastructure footprint has to be smaller. The ad tech industry leads in defining next-generation data platforms that can handle huge data sets with lower cost-per-byte requirements. We’re confident that adopting our technology criteria can only positively impact everyone’s bottom lines.
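A quick back-of-envelope pass over those two figures shows why the footprint matters: the platform must absorb nearly four million bids per second, each carrying on the order of a kilobyte of data.

```python
# Back-of-envelope scale from the figures quoted above: 400 TB of
# uncompressed data per day and 10 trillion advertiser bids per month.
bids_per_month = 10e12
seconds_per_month = 30 * 86_400
print(f"{bids_per_month / seconds_per_month:,.0f} bids per second")  # ~3,858,025

bytes_per_day = 400e12
bids_per_day = bids_per_month / 30
print(f"{bytes_per_day / bids_per_day:,.0f} bytes per bid")  # ~1,200
```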

The key to success is getting the right tool for the problem. Digital advertising operates in the realm of extreme data, where it’s all about volume, speed, and cost. The increase in data volume is unpredictable but the costs can’t be. While it is easy to get stuck with familiar technologies, big data alone is not enough to keep up with the pace of ad tech.

Source: The above opinions are a personal perspective based on information provided by Forbes and contributor Vasu Cherlopalle.

https://www.forbes.com/sites/kinetica/2018/08/17/can-big-data-alone-keep-up-with-ad-tech/#33556029c02f

Data Retention: Tough Choices Ahead

September 3rd, 2018

As the cost per byte of storage has declined, it has become a habit to simply store data “just in case.” At a time when the overwhelming majority of data was generated by human beings, nobody thought much of it. Data was summarized, information extracted from it and the raw data points were still kept should they be needed later. Later seldom came.

Cisco tells us that as of 2008, there were more things connected to the internet than people, so we can use that as the point when the amount of data being generated and stored had its hockey-stick moment. Now we have more sensors in more places, monitoring more and more activity and generating more and more data points. In 2010, then-Google CEO Eric Schmidt estimated that we were generating and storing as much data every two days as we did from the dawn of civilization up to 2003.

That’s a lot of data.

Running Out Of Room, Or…

The natural reaction is to instinctively feel that, at some point, we’re going to run out of storage capacity. If Moore’s Law holds, that won’t happen. We’ll just keep inventing new, more compressed storage technologies.

But what we are running out of is time.

Long ago, the last thing anyone in the data centre did at the end of the day was make sure the daily backups were running. They would run into the night all by themselves. Then they would run through the night. Then they were still running when everyone came into the office in the morning.

Fortunately, we’re clever and adaptable, so we came up with incremental backups. Instead of recopying and recopying data we had already copied, we only copied data that had changed since the last backup. Then we moved to faster backup media. Now we’re backing up the data as we’re saving it in primary storage. Ultimately, the restore time objective becomes impossible to achieve in the time available to us.
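The logic of an incremental backup itself is simple enough to sketch: compare file modification times against the previous run and copy only what has changed. A minimal illustration with hypothetical paths:

```python
# A minimal sketch of an incremental backup: copy only files modified
# since the previous run, instead of recopying the full data set.
# The source and destination paths are hypothetical illustrations.
import shutil, time
from pathlib import Path

def incremental_backup(src: Path, dst: Path, last_run: float) -> int:
    copied = 0
    for f in src.rglob("*"):
        if f.is_file() and f.stat().st_mtime > last_run:
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied += 1
    return copied

last_run = time.time() - 86_400  # e.g., yesterday's backup finished then
n = incremental_backup(Path("/data"), Path("/backup"), last_run)
print(f"copied {n} changed files")
```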

Making Tough Choices

Now we have to make a difficult choice. Once we’ve processed the data and created valuable information, do we keep the original raw data as it was collected, or do we discard it?

Or do we save some of the raw data and discard the rest? What are the criteria upon which that choice can be made? How do we anticipate in our planning which data points need to be stored and which will be discarded?

Now Add Machine Learning

This problem is exacerbated by the introduction of machine learning and artificial intelligence technologies into data analytics. When a machine is performing much of the data collation, selection and processing, how are we to know which data points the machine will want to retrieve to complete its analysis? What if we choose incorrectly?

Other Possible Strategies

Being more pragmatic about this challenge, we need to think about data reduction. First of all, when and where does it occur?

Many of us take a physical relocation from one place to another as an opportunity to discard belongings that we no longer need. Some perform this discarding as they are packing to move. Others, often in a rush to make the move, simply pack everything and promise to do the discarding when they arrive at the new location. Many of us have boxes upon boxes that have yet to be unpacked since we moved in many years ago.

In the classic framework, we can choose to perform data reduction at the core of the network, in the server processors that will perform all the analytics. Or we can choose to perform data reduction at the edge, where the data is being collected, so that the load on the servers and storage is reduced.

It is likely that the ultimate solution will be a combination of both, depending on the workload and the processing required.
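As a simple illustration of reduction at the edge, a device can collapse a window of raw sensor readings into summary statistics before anything crosses the network; the window size and fields below are hypothetical:

```python
# A toy sketch of edge-side data reduction: a window of raw sensor
# readings is collapsed into a small summary before being sent to the
# core. The window and its fields are hypothetical illustrations.
from statistics import mean

def summarize_window(readings: list[float]) -> dict:
    """Reduce a window of raw samples to the statistics the core needs."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

raw = [21.1, 21.3, 20.9, 21.0, 35.2, 21.2]  # one window of samples
print(summarize_window(raw))
# Six raw points become four numbers; the 35.2 outlier survives in
# "max", but its exact timestamp is gone. That is precisely the
# trade-off described above: reduction discards detail an analysis
# (or a machine learning model) might later have wanted.
```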

Begin With The End In Mind

There has been much discussion about data science — how it’s the art of extracting useful information from data and turning it into knowledge that facilitates superior decision making.

As the internet of things continues to produce data at Schmidt’s estimated rate of five exabytes every two days, data science must expand its scope to include the development of an end-to-end data strategy. This must begin with careful planning and consideration surrounding the collection of data, layers of summarization and reduction, pre-processing and, finally, deciding which data points get stored and which are discarded.

As is always the case with data storage issues, this will be a volume-velocity-value process based on the business use case involved and on the point at which data gains value. The science is nascent, but the opportunity is immense.

Source: The above opinions are a personal perspective based on information provided by Forbes and contributor Rick Braddy.

https://www.forbes.com/sites/forbestechcouncil/2018/08/27/data-retention-tough-choices-ahead/#49ac9a602c94