SAP S/4HANA for central procurement overview

SAP S/4HANA for central procurement: what is it all about?


The procurement line of business in larger companies is mostly organized either globally or in divisions and channels that are responsible for regions, countries, and locations. These teams often manage spend processes in distributed procurement landscapes with numerous ERP back-end systems.

Procurement is particularly affected by this lack of centralization, because it relies on scale and volume to improve and automate processes, save costs, and work more efficiently.

SAP S/4HANA for central procurement provides a unified hub that allows procurement teams to work as a shared service across the multiple ERP systems of global organizations, divisions, and channels.

It delivers the benefits of a centralized procurement system without requiring a massive shift away from already established local procure-to-pay process flows; it operates across these buying channels without the need to rework them.

SAP S/4HANA for central procurement: solution overview


SAP S/4HANA for central procurement delivers company-wide visibility based on roles and permissions across a company’s ERP landscape. Lead buyers can access contracts, requisitions, orders, status and spending across the company.

Usually a purchaser works in one or multiple SAP ERP or SAP S/4HANA back-end systems. With central procurement, they can use a single SAP S/4HANA system to monitor, process and analyze the relevant documents centrally.

A strategic purchaser or commodity manager is able to structure global contracts hierarchically and to distribute them to local systems as sources of supply, in the form of outline agreements or scheduling agreements.

The solution covers the following business scenarios:

  • Central Requisitioning (2FI)

  • Central Sourcing (3ZF)

  • Central Purchase Contracts (2ME)

  • Central Purchasing (2XT)

  • Central Purchasing Analytics (1JI, 2ME, 2XT)

  • Guided Buying for Central Procurement with SAP Ariba Buying (3EN)


What to Know About SAP HANA Installation for Certification [With Practice Q&As]

These practice questions will help you evaluate your understanding of SAP HANA installation ahead of an SAP HANA certification.

The questions shown are similar in nature to those found on the certification examination. Although none of these questions will be found on the exam itself, they will allow you to review your knowledge of the subject. Select the correct answers, and then check the completeness of your answers below.

Practice Questions

Q1: For which SAP HANA scenarios is sizing from an SAP sizing expert required? (There are three correct answers.)

  A. When consolidating multiple source systems into a single system
  B. When carving out functionality from the source system
  C. When migrating a high-volume legacy system
  D. When migrating SAP BW and SAP Business Suite
  E. When migrating SAP NetWeaver-based systems from AnyDB to SAP BW/4HANA and SAP S/4HANA

Q2: Where can you find certified and supported SAP HANA hardware appliances centrally listed?

  A. On the SAP-certified SAP HANA hardware directory
  B. On the PAM
  C. On the SAP ONE Support Launchpad
  D. On the websites of SAP-certified hardware partners

Q3: Which templates are available in the Quick Sizer tool for SAP HANA? (There are two correct answers.)

  A. SAP S/4HANA Cloud
  B. SAP NetWeaver
  D. Standalone SAP HANA

Q4: What’s the difference between sizing for an SAP HANA appliance and TDI?

  A. SAP HANA appliance sizing is performed by the hardware partner.
  B. You don’t need to consider CPU and storage sizing for an appliance.
  C. The SAP HANA appliance is preconfigured and doesn’t require sizing.
  D. The Quick Sizer tool can only be used in TDI sizing projects.

Q5: How do you size brownfield implementations? (There are two correct answers.)

  A. Use the Quick Sizer tool for SAP HANA.
  B. Use the latest sizing report for the application attached to an SAP Note.
  C. Brownfield implementations always require expert sizing.
  D. Consult the SAP in-memory computing sizing guidelines.
  E. Use the SAP HANA sizing decision tree.

Q6: How do you perform SAP HANA in-memory sizing for generic migration scenarios?

  A. Use a formula that considers the source data footprint (tables only), adds the requirements for dynamic objects, divides by an assumed compression ratio, and multiplies by the source database-specific compression factor (if applicable).
  B. Use the latest sizing report for the application attached to an SAP Note.
  C. Use the Quick Sizer tool for AnyDB.

Q7: How do you calculate disk space requirements for generic migration scenarios?

  A. Use a formula that considers the net data size on disk plus 20% additional space for delta merge operations plus anticipated growth for the data volume; 0.5 * RAM with a maximum of 512 GB for the log volume; and 1 * RAM with a maximum of 1 TB for the software installation.
  B. Calculate the required inputs/outputs per second (IOPS).
  C. Use the rule of thumb of three times RAM.

Q8: Which technology directly impacts SAP HANA memory sizing?

  A. SAP HANA smart data access
  B. SAP HANA dynamic tiering
  C. SAP HANA persistent memory
  D. SAP HANA extension nodes

Q9: Which is the recommended approach to perform a comprehensive hardware configuration check for custom SAP HANA installations?

  A. The hardware configuration check is performed automatically by the SAP HANA installer prior to server software installation.
  B. Implement the recommendations and requirements of the SAP Notes listed in the SAP HANA Installation and Update Guide.
  C. Run the hardware configuration check tool with the appropriate configuration template files.
  D. Hardware configuration checks are only available as part of the SAP GoingLive Check.

Q10: Which operating systems are supported for the SAP HANA server? (There are two correct answers.)

  A. Ubuntu Server LTS
  B. SUSE Linux Enterprise Server (SLES)
  C. Red Hat Enterprise Linux (RHEL) for SAP solutions
  D. openSUSE Leap for SAP applications

Q11: Which network zone is used for SAP HANA system replication?

  A. Internal
  B. Storage
  C. Local
  D. Client

Q12: Which file system is recommended by SAP for SAP HANA?

  A. NTFS
  B. EXT3
  C. IBM Spectrum Scale (GPFS)
  D. XFS


Practice Question Answers and Explanations

Question 1

Correct answers: ABC

 For new SAP software implementations (greenfield) on SAP HANA, SAP recommends using the Quick Sizer tool. For migrations of SAP NetWeaver-based applications, there are specific sizing reports (SAP Notes) depending on whether you’re interested in SAP BW on SAP HANA, SAP Business Suite on SAP HANA, SAP BW/4HANA, SAP S/4HANA, or other applications. Any system that is large or complex requires sizing from an SAP sizing expert.

Answers D and E are incorrect because these are standard brownfield sizing scenarios addressed in the respective SAP Notes.

Question 2

Correct answer: A

 The SAP-certified SAP HANA hardware directory is shown in the figure below. Answer B is incorrect because the PAM contains a link to the hardware directory but doesn’t provide information about certified and supported appliances. Answer C is incorrect because the SAP ONE Support Launchpad contains information about SAP software but not about SAP-certified hardware. Answer D is incorrect because you might find information about SAP-certified appliances listed on the websites of SAP hardware partners, but this isn’t a centrally listed directory.

Question 3

Correct answers: CD

There are three different versions of the Quick Sizer tool. The version for SAP HANA contains only templates for SAP applications running on SAP HANA, including both the “powered by SAP HANA” releases and the “/4HANA” products, but not cloud-based editions or generic (AnyDB) sizing for products such as SAP NetWeaver.

Answer A is incorrect because the Quick Sizer tool for SAP HANA can’t be used to size SAP S/4HANA Cloud. For this, use the SAP S/4HANA Cloud Quick Sizer. Answer B is incorrect because there is no sizing tool for SAP NetWeaver (product family) or SAP NetWeaver AS for ABAP (application server), but there are Quick Sizers for SAP NetWeaver-based applications such as SAP Business Suite, both when powered by SAP HANA and when running on AnyDB.

Question 4

Correct answer: B

SAP HANA appliances are standardized offerings and can’t be mapped to business requirements in an iterative approach. However, even with appliances, sizing is needed to select the right appliance for the workload.

Answer A is incorrect because the hardware partner doesn’t perform sizing for the appliance, as the business requirements aren’t known. Answer C is incorrect because the appliance is preconfigured but does need to be sized for the workload. Answer D is incorrect because you can use the Quick Sizer tool for both appliance and TDI configurations. The Quick Sizer tool is typically used in new implementations (greenfield), whereas for SAP application migration projects (brownfield), different SAP Notes with specific SQL scripts attached are used.

Question 5

Correct answers: BD

Brownfield implementations refer to migration projects of existing SAP applications from AnyDB to SAP HANA. For standard scenarios, such as migrating SAP BW to SAP BW/4HANA, there are specific SAP Notes that detail the actions to perform. Additional information can be found in the sizing guides listed on the SAP Help Portal under “SAP In-Memory Computing Sizing Guidelines.”

For new installations, also known as greenfield projects, you use the Quick Sizer tool for SAP HANA.

Complex migration projects typically require expert sizing, but this isn’t a requirement per se for brownfield implementations.

The SAP HANA sizing decision tree shows the different sizing approaches for greenfield and migration sizings with guidance on which SAP Note to use.

Question 6

Correct answer: A

 This formula is documented in the SAP HANA In-Memory Database Sizing Guideline (PDF) document attached to SAP Note 1514966 Sizing SAP In-Memory Database.

Answers B and C are incorrect because there is no sizing report for generic applications, and there is no Quick Sizer tool for AnyDB.
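The formula described in answer A can be made concrete with a small calculation. The sketch below is illustrative only: the input figures and the 7:1 compression ratio are made-up placeholders, and the authoritative procedure is in the guideline attached to SAP Note 1514966.

```python
# Illustrative sketch of the generic in-memory sizing formula described in
# answer A. All figures and the compression ratio are hypothetical; see the
# sizing guideline attached to SAP Note 1514966 for the real procedure.

def hana_memory_estimate_gb(table_footprint_gb, dynamic_objects_gb,
                            compression_ratio=7.0, source_db_factor=1.0):
    """Estimate RAM: (tables + dynamic objects) / compression ratio,
    scaled by a source-database-specific factor (if applicable)."""
    compressed = (table_footprint_gb + dynamic_objects_gb) / compression_ratio
    return compressed * source_db_factor

# Example: 2,800 GB of tables, 200 GB of dynamic objects, assumed 7:1 ratio
estimate = hana_memory_estimate_gb(2800, 200)
print(round(estimate))  # 429 (GB) in this illustrative case
```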

Question 7

Correct answer: A

Recommendation from SAP Note 1514966 (see the “Greenfield, Brownfield, and Expert Sizing” section).

Answer B is incorrect because, in practice, storage throughput in IOPS may provide a more important storage requirement than the actual disk space. However, the question asked about disk space requirements. Answer C is incorrect because the rule of thumb of three times RAM provides a rough calculation for the space requirements of an SAP HANA appliance as this also allows for backups and memory dumps (exports) on the same volume. For production systems, only database data files should be stored on the data volume.
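The rules quoted in answer A translate into simple arithmetic. The sketch below uses hypothetical input values; consult SAP Note 1514966 for the actual guidance.

```python
# Illustrative sketch of the disk-sizing rules quoted in answer A.
# Input values are hypothetical placeholders.

def disk_sizing_gb(net_data_gb, growth_gb, ram_gb):
    data_volume = net_data_gb * 1.2 + growth_gb   # +20% for delta merges
    log_volume = min(0.5 * ram_gb, 512)           # capped at 512 GB
    install_volume = min(1.0 * ram_gb, 1024)      # capped at 1 TB
    return {"data": data_volume, "log": log_volume, "install": install_volume}

sizes = disk_sizing_gb(net_data_gb=1000, growth_gb=200, ram_gb=2048)
print(sizes)  # {'data': 1400.0, 'log': 512, 'install': 1024}
```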

Exam Tip: Formulas provide great material for exam questions. SAP Note 1514966 dates from 2013 when the TDI program was launched and may no longer be used all that much for actual sizing projects.

Question 8

Correct answer: C

 Although all listed technologies impact memory sizing indirectly by reducing the amount of memory required to store the tables, only persistent memory (PMEM) does so directly as reflected in a dedicated note. As mentioned in SAP Note 2786237 – Sizing SAP HANA with Persistent Memory, expert sizing is strongly recommended when implementing PMEM.

SAP HANA smart data access (SDA) uses linked databases, virtual tables, and query federation to access remote data as if it’s stored locally, without data copy (only metadata is stored). SDA was introduced with SAP HANA 1.0 SPS 06.

SAP HANA dynamic tiering is an optional component that extends the in-memory database with a disk-based columnar store. In the temperature analogy, hot data is stored in memory while less frequently accessed warm data is placed in the extended store. SAP HANA dynamic tiering is an add-on component, separately licensed, with its own documentation set and listing on the PAM. It uses SAP IQ technology for the extended store and supports native SAP HANA applications but not SAP BW or enterprise resource planning (ERP) applications (SAP S/4HANA). SAP HANA dynamic tiering was introduced with SAP HANA 1.0 SPS 10.

SAP HANA extension nodes enable you to allocate a node (host) of a multiple-host, scale-out (distributed) system as a warm data memory store. Extension nodes were introduced with SAP HANA 1.0 specifically for SAP BW but currently also support native applications. Other warm and cold data aging solutions specific to SAP BW are nearline storage and the “nonactive data” concept.

 Exam Tip: For this topic area, you’re not expected to be intimately familiar with all these data tiering solutions, but you should have a general understanding of what they are and how they are used.

Question 9

Correct answer: C

 To perform a comprehensive hardware configuration check, you must run the hardware configuration check tool with the appropriate configuration template files.

Answer A is incorrect because the SAP HANA installer calls the Python script and exits in case of noncompliance. However, this script only verifies the software prerequisites required for the installation and doesn’t perform a comprehensive check of hardware, network, and storage configurations. Answer B is incorrect because running the hardware configuration check tool isn’t required, and you could opt to perform all the recommendations and requirements checks manually, although this isn’t the recommended approach. Answer D is incorrect because SAP strongly recommends using the hardware configuration check as part of an SAP GoingLive Check for SAP HANA, but the tool is freely available for download for every SAP HANA customer and documented in SAP Notes. To use the hardware configuration check tool, the SAP GoingLive Check service isn’t required.

Question 10

Correct answers: BC

 SLES and RHEL for SAP solutions are the operating systems supported for the SAP HANA server.

Answer A is incorrect because, although you may find blogs on the SAP Community about how to install SAP HANA, express edition, on Ubuntu, this Linux distribution is only supported for the SAP HANA client. Answer D is incorrect because openSUSE is the open-source project behind SUSE Linux and Leap is its regular release; there is no “for SAP applications” edition, as this exists only for SLES.

Question 11

Correct answer: A

 An internal network zone is used for system replication.

Answer B is incorrect because the storage network zone is only used to communicate between the system and storage subsystems (NAS or SAN) for persistence, log, and backups. Answer C is incorrect because local, global, and internal are valid configurations for the listen interface, but local isn’t a network zone.

Question 12

Correct answer: D

XFS is the file system recommended by SAP.

Answer A is incorrect because the Microsoft Windows NT File System (NTFS) isn’t supported, as SAP HANA only runs on Linux. Answer B is incorrect because the EXT3 file system isn’t supported. Answer C is incorrect because IBM Spectrum Scale (GPFS) is a clustered file system that is supported for SAP HANA but isn’t the one SAP recommends.


In this post, you were able to practice a dozen questions on SAP HANA installation that you should know for the SAP HANA 2.0 Technology Associate certification exam. Best of luck on your test!

Editor’s note: This post has been adapted from a section of the book SAP HANA 2.0 Certification Guide: Technology Associate Exam by Denys van Kempen.


SAP HANA 2.0 Certification Guide: Technology Associate Exam

Preparing for your SAP HANA 2.0 technology associate exam? Make the grade with this certification study guide! From installation and configuration to monitoring and troubleshooting, this guide will review the key technical and functional knowledge you need to pass with flying colors. Explore test methodology, key concepts for each area, and practice questions and answers.

Source: SAP Press

Announcing – Free Masterclass – “Digital Marketing for small business owners” – Sat, 28th Jan 2023, 11:00 AM to 12:00 PM EST. Register Now!

Master Digital Marketing to contribute meaningfully to your Business!

Digital marketing is more important now than ever. Take steps to create, streamline, or optimize your digital marketing (also known as internet marketing) today and safeguard your business.

Register for a FREE Masterclass on “Digital Marketing for small business owners” on Saturday, 28th Jan 2023, from 11:00 AM to 12:00 PM EST.

Learn digital marketing for your business with IIBS. We specialize in developing innovative concepts and ideas to help skyrocket your business.

Register Free:

Whether you:
• run a small online business and handle your digital marketing on your own, or
• run a face-to-face business,

you NEED to know digital marketing. Without it, you’re giving up on converting anyone who shops or looks for products or services online.
Our team consists of experienced professionals who understand Canada’s digital landscape and will work closely with you.

Digital marketing will help your business to:
• Create a customer persona.
• Seize growth opportunities and boost revenue for your Business.

With great delight, we invite you to join our Free Live masterclass.

Register for a FREE Masterclass on “Digital Marketing for small business owners” on Saturday, 28th Jan 2023, from 11:00 AM to 12:00 PM EST.

Register now:

The meeting details will be sent to you to confirm your spot.

You can make potential clients aware of your existence if you learn from professionals in the field, apply your learning to real-world projects, and receive support.

AWS Overview Training Confirmed to start Jan 29th 2023

Confirmed start date – Jan 29th, 2023

Training Objectives

  • 5 weeks in class training by a certified instructor on weekends
  • Access to online mock exam
  • Training material is provided
  • Learning Management System (LMS) Access for Practice
  • Regular assignments & reviews along with internal assessment
  • Course Completion Certificate

Who should attend


  • Professionals who want to pursue a career in AWS Cloud Computing
  • System Administrators or Architects
  • AWS beginners without prior AWS experience
  • Programmers Interested in Deploying Applications on AWS
  • Experienced IT professionals

Classes can be repeated by the students within the same course for a low fee


For more details about the schedule call us at 905-268-0958 or email at

S/4 HANA procurement and Sourcing (SAP MM) Confirmed to start Jan 28th 2023

Confirmed start date – Jan 28th, 2023

The SAP MM (S/4HANA for Sourcing) certification course verifies that the candidate possesses knowledge and skills in the area of SAP Procurement to satisfy the requirements for the consultant profile so that the candidate can implement this knowledge practically in projects.

Training Objectives

60 hours of in-class training conducted on weekends

  • Focused on the SAP S/4HANA Sourcing and Procurement Certified Application Associate certification
  • Certified instructor with over 10 years of industry experience and excellent instructor reviews
  • Training material is provided
  • Learning Management System (LMS) Access for Practice
  • Thorough grounding in S/4HANA sourcing concepts
  • Regular assignments & reviews along with internal assessment
  • Individual attention
  • Course Completion Certificate awarded to all candidates
  • Classes can be repeated by the students within the same course for a low fee


For more details about the schedule call us at 905-268-0958 or email at

SAP S/4 HANA Finance Projected to start Jan 29th 2023

Projected start date of the batch: Jan 29th, 2023. REGISTRATIONS OPEN NOW.

Companies need to integrate and upgrade to S/4HANA, resulting in huge demand for SAP finance professionals who can execute a successful and smooth transition from SAP R/3 to S/4HANA. These upgrades mean that the demand for S/4HANA professionals will continue to grow in the SAP job market.

Who can do this course

Candidates with basic knowledge of Finance

Consultants with a BI, HANA, or Finance background

IT professionals who want to build a career in SAP Finance or who work in the financial domain

SAP technical & functional professionals

Students & College Graduates with finance, computer science, business engineering degrees

IT professionals who are into other non-finance SAP modules

SAP Project Managers



Prerequisites: basic knowledge of Finance or any SAP functional module


The trainer is SAP S/4HANA certified with over 15 years of deep functional and technical experience in SAP Finance and Accounts. Get trained by a highly experienced and knowledgeable trainer.

Call us at 905-268-0958 or email at

6 Predictions About Data In 2020 And the Coming Decade

It’s difficult to make predictions, especially about the future. But one fairly safe prediction is that data will continue eating the world in 2020 and the coming decade. The most important tech trend since the 1990s will no doubt accentuate its presence in our lives, for better or for worse.

At the beginning of the last decade, IDC estimated that 1.2 zettabytes (1.2 trillion gigabytes) of new data were created in 2010, up from 0.8 zettabytes the year before. The amount of newly created data in 2020 was predicted to grow 44X to reach 35 zettabytes (35 trillion gigabytes). Two years ago, we were already at 33 zettabytes, leading IDC to predict that in 2025, 175 zettabytes (175 trillion gigabytes) of new data will be created around the world.
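As a quick sanity check, the growth rates implied by the IDC figures above can be computed directly (assuming the 33-zettabyte figure refers to 2018):

```python
# Growth rates implied by the cited IDC figures:
# 1.2 ZB in 2010, 33 ZB in 2018 (assumed), a forecast 175 ZB in 2025.

def cagr(start, end, years):
    """Compound annual growth rate between two data volumes."""
    return (end / start) ** (1 / years) - 1

print(f"2010-2018: {cagr(1.2, 33, 8):.0%} per year")   # roughly 51%
print(f"2018-2025: {cagr(33, 175, 7):.0%} per year")   # roughly 27%
```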
The most important new tech development of the passing decade has been the practical success of deep learning (popularly known as “artificial intelligence” or “AI”), the sophisticated statistical analysis of lots and lots of data or what I have called Statistics on Steroids (SOS). In the coming decade, data will continue to beget data, to break boundaries, to drive innovation and profits, and create new challenges and concerns.

Faster networks will re-energize the data virtuous cycle

The constant increase in data processing speeds and bandwidth, the nonstop invention of new tools for creating, sharing, and consuming data, and the steady addition of new data creators and consumers around the world, ensure that data growth continues unabated. Data begets more data in a constant virtuous cycle.
But from time to time a specific tool or technology acts as a new catalyst. Over the last decade, such catalysts were smartphones and social networks. Over the next few years, a new catalyst will be 5G networks, and by 2030, 6G networks with speeds of 1 terabyte per second. Internet delivered via satellites will play a similar role in accelerating the movement of data and reducing latency in the not-too-distant future.

There will be many new places for data to emerge and spread

Data fills all voids. Even after thirty-five years of enterprise “digital transformation” and twenty-five years since the big data big bang (i.e., the Web, popularly known as “the internet”), there are still a few billion unconnected people and industries that are still primarily analog. The people of Asia and Africa are correcting the former lacuna, and sectors such as agriculture, healthcare, and education provide the missing pieces for the latter. Other dominant (and relatively new) sources of data creation and consumption will be things such as sensors, which enable new locations for enterprises to collect and even process and analyze data, and things that move, such as automobiles.
In addition, enterprises will accelerate their shift from focusing on managing (collecting, storing, analyzing) internal data to investing the greater part of their IT resources in managing (collecting, storing, analyzing) external data, most of it “unstructured,” with a significant share of this data emerging from new audio and video sources.

Synthetic data will add a new dimension to data growth

Deep learning is brute force AI. Unlike Deep Blue, however, its brute force is derived (primarily) from lots and lots of data rather than lots and lots of processing power. Unlike Google and Facebook and Amazon, however, most enterprises do not have lots of data (relatively speaking) in their data centers. Their solution will be to take their own (meager) volumes of data and synthesize it to create the amount of data required for training their algorithms and validating their models. Synthetic data will graduate from its role as a sub-set of anonymized data to play a new one in the training of deep learning algorithms in data-challenged enterprises.
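As a toy illustration of the idea, an enterprise with a meager sample can fit simple statistics on it and draw as many synthetic records as model training needs. The sketch below is deliberately naive (a normal fit using Python's standard library), and the order values are invented; real synthetic-data tooling is far more sophisticated.

```python
# Toy illustration of synthetic data: fit simple statistics on a small real
# sample, then generate as many synthetic rows as training needs.
# The order values are invented; real tools (GANs, copulas) go far beyond this.

import random
import statistics

def synthesize(sample, n, seed=42):
    """Draw n synthetic values from a normal fit of the real sample."""
    mu = statistics.mean(sample)
    sigma = statistics.stdev(sample)
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) for _ in range(n)]

real_orders = [120.0, 95.5, 130.2, 110.8, 101.3]   # the "meager" real data
synthetic_orders = synthesize(real_orders, n=10_000)
print(len(synthetic_orders))  # 10000
```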

The business of data will become a significant sector of the global economy

Google and Facebook have led the way in showing the world you can build a very large and profitable business based solely on the collection and analysis of data. But their revenues are derived almost exclusively from serving as an advertising platform, and the value of their data to their business is measured in relation to advertising efficiency and effectiveness. In the coming decade we will see more and more businesses, some growing quite large, whose sole business is data as an asset that has a defined intrinsic value and can be bought, sold, serviced, and added (as a distinct component) to other products and services. This will also be true for many enterprises that will come up with metrics to measure the value of data to their business and ways to “monetize” it, i.e., make it a distinct revenue stream. And the sub-sector of the data economy known as cyber crime will continue to grow by leaps and bounds.

The most successful and well-paying jobs will be data-related

From data preparation to perfecting data analysis models, the most important jobs of this coming decade will be related to data, its management and protection, its governance and monetization, its analysis and role in decision-making. In addition to the establishment and proliferation of data-related jobs, “data literacy” will become a major focus of internal training for all employees in many enterprises. Almost all products and services will either be based on data or will have a data component and their developers, managers, and sellers will have to be data proficient.

We will continue to trade our data privacy for convenience, entertainment, and feeling connected

Regardless of new government-mandated data privacy policies, our deliberate or unwitting or forced data nakedness will ensure the continuation—and possibly acceleration—of the data mining and monetization by enterprises and government agencies (such as DMVs in the US). “Other people’s money” will be dwarfed by “other people’s data.”
In the coming decade, buzzwords will come and go, but data—its growth, analysis, and use—will be the most significant and consistent tech trend. Ones and zeros will continue eating the world.

Easy Ways to Boost Your Mobile App Testing Skills

Mobile application testing continues to be an integral part of almost any skilled QA engineer’s job. Most professionals use a wide range of advanced utilities and mobile testing tools to make the testing process as precise as possible. Is there any universal software for application testing that most companies use? What is the best way to improve QA management in your company? We’ve collected a number of must-have solutions for making the testing process easier, deeper, and much more effective.

Top Tools That Will Improve Your Testing

Emulator and simulator software. These are incredibly handy solutions for testers who have a limited number of devices. You don’t need to obtain every iPhone version to test an app on iOS gadgets; a professional emulator is always ready to help. The most popular programs for UX/UI testing and other types of tests across various devices are Xcode, AVD, and Genymotion. However, these are the most expensive solutions compared to crowd testing services that use real devices and real people.

Note: Although testing on virtual devices can provide you with tons of useful information, using physical devices is still preferable. That’s why using the services of a crowd testing company might be a great option.

Network configuration tools. The performance of modern devices is tightly connected to the type of network they use. Therefore, it is important to have special tools that allow you to configure network settings. Network configuration tools are exceptionally useful when you need to imitate a bad Internet connection or lost data packets. Network Link Conditioner is likely the best solution for making advanced network settings on most types of gadgets. If you are looking for proxy testing tools, it might be a good idea to try Charles or Fiddler. And don’t forget about API testing utilities: Postman, Insomnia, Paw, and others.
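When no dedicated tool is at hand, the same idea can be approximated in a test harness. This hypothetical sketch, using only Python's standard library, randomly drops and delays "packets" to mimic a lossy, slow link:

```python
# Hypothetical sketch: simulate a bad network link in a test harness by
# dropping packets with a given probability and adding per-packet latency.

import random
import time

def degrade(packets, loss_rate=0.1, latency_s=0.05, seed=7):
    """Return the packets that 'survive' a lossy, slow link."""
    rng = random.Random(seed)
    delivered = []
    for pkt in packets:
        if rng.random() < loss_rate:
            continue            # packet lost
        time.sleep(latency_s)   # simulated network delay
        delivered.append(pkt)
    return delivered

sent = [f"pkt-{i}" for i in range(20)]
received = degrade(sent, loss_rate=0.2, latency_s=0.0)
print(f"{len(received)} of {len(sent)} packets arrived")
```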

Automation testing solutions.
Most experienced testers will tell you that Selenium is the leading program for automation testing. However, it is not the only solution for running tests in a fast and reliable way. If you are new to automation testing, it is recommended to begin with Appium. This application is more beginner-friendly and will help you learn the basics of automated testing. Among the other good alternatives are KIF, Robotium, and Frank.

If you need a tool focused on a specific mobile operating system, consider XCUITest for iOS devices and Espresso for Android. Both are easy to use and configure, which is highly valued by most beginners.
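A minimal Appium-style smoke test might look like the sketch below. It assumes a running Appium server and the Appium-Python-Client package; the device name and app path are hypothetical, and the capability names follow Appium's conventions.

```python
# Hypothetical Appium smoke test. Requires a running Appium server and the
# Appium-Python-Client package; device name and app path are placeholders.

def android_caps(app_path):
    """Build the desired-capabilities dict an Appium session expects."""
    return {
        "platformName": "Android",
        "automationName": "UiAutomator2",   # Appium's Android driver
        "deviceName": "emulator-5554",      # hypothetical emulator
        "app": app_path,
    }

def smoke_test(server_url="http://localhost:4723/wd/hub"):
    # Older client versions accept a capabilities dict directly;
    # newer ones wrap capabilities in an options object.
    from appium import webdriver  # pip install Appium-Python-Client
    driver = webdriver.Remote(server_url, android_caps("/tmp/app-debug.apk"))
    try:
        # Minimal check: the app launched and reports a package name.
        assert driver.current_package is not None
    finally:
        driver.quit()

# Call smoke_test() only with a live Appium server and emulator available.
```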

Manual testing. This is the process of manually testing software for defects, and it should not be ignored, especially in digital testing. By playing the role of the end user and using most of the features on real devices under real-life conditions, you can provide extremely valuable feedback. Manual testing should be the first thing done on a new product (before any automation), and it’s a necessary step to check the feasibility of automation testing. Manual testing does not require any special tools, but it’s recommended to use good reporting tools such as Jira, YouTrack, Redmine, Ubertesters, and many more.

Exploratory testing. Writing standard test cases is good but not sufficient for some projects. Therefore, some expert QA engineers perform exploratory testing alongside classical test cases to better understand an app’s weaknesses. With exploratory testing, you can find bugs and crashes in completely unexpected places and conditions. Exploratory testing is one of the most important types of testing services. As a rule, you don’t need to do anything to order this kind of service; the entire testing job is done by professional QA engineers.

Beta testing tools. How will a certain app behave when you install it from the store? Will everything be okay if you decide to update it from the current version? To check these scenarios, you will need professional tools for beta testing. TestFlight is one of the greatest apps for these purposes. Another alternative is the Ubertesters beta testing management platform, which also lets you distribute your builds among testers, get crash reports and video recordings of test sessions, manage testers, and much more.

Note: Beta testing tools such as TestFlight usually require a certificate for the app, so don’t rush to use them in the early stages of your project. The Ubertesters platform does not require the app to be pre-approved in the store; thus, it might be a better alternative for build distribution and beta testing.

Alpha testing tools. Almost any test plan includes alpha testing. The truth is that this type of testing is really useful before the release and in the early stages of development. If you don’t know where to start, it is recommended to try the most popular solutions that are widely used by experienced testers. AppBlade, Ubertesters, Google Firebase, and HockeyApp are among the greatest options for alpha testing.

Application analysis software. These tools are great for reporting bugs and crashes with the help of simple, comprehensible charts and diagrams. Different tools offer different advanced functions, so it is better to try a few programs to find the one that best suits your needs. Appsee, Crashlytics, and Firebase are among the leading application analysis tools available on the market. Each one is unique and focused on different testing areas.
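At their core, these tools boil raw crash events down into simple charts. As a rough illustration (not any vendor’s actual API), a toy aggregator might group crash reports by device model like this:

```python
from collections import Counter

def summarize_crashes(events):
    """Count crash events per device model, most frequent first.

    Each event is a dict like {"device": "Pixel 7", "error": "NullPointerException"}.
    """
    counts = Counter(e["device"] for e in events)
    return counts.most_common()

crashes = [
    {"device": "Pixel 7", "error": "NullPointerException"},
    {"device": "iPhone 14", "error": "EXC_BAD_ACCESS"},
    {"device": "Pixel 7", "error": "OutOfMemoryError"},
]
print(summarize_crashes(crashes))  # [('Pixel 7', 2), ('iPhone 14', 1)]
```

A real analytics platform would of course add stack-trace deduplication, OS-version breakdowns, and trend charts on top of this kind of grouping.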

Pro Tips for Better Testing Results

SAP Data Intelligence: Enterprise AI Meets Intelligent Information Management

Artificial Intelligence has struggled to live up to the hype of recent years.

If you were to believe the buzz, AI would be responsible for automagically solving all our biggest problems with complex computer wizardry and granting all of us a life of leisure and simplicity. It reminds me of The Hitchhiker’s Guide to the Galaxy, in which hyper-intelligent beings design a computer to reveal the answer to the meaning of life, the universe, and everything – only to find out that the answer is 42, and they never knew what the original question was anyway.

At the same time, Information Management approaches have failed to keep pace with technological change. Most of this technology was built and designed for the days of on-premise applications that wrote to on-premise databases, where the goal was to extract data and load it into a data warehouse for BI and reporting. While that need still exists, the data that we manage and the ways we extract value from that data have all radically shifted and diversified.

We are left with a complex mix of structured, unstructured, and object store data residing in a blend of cloud and on-premise systems, with API access that is often limited or non-standardized. The result is a complicated landscape of data sprawl, tooling diversification, and data silos. All of this leads to an increasing inability to “locate the wisdom we have lost in knowledge” and the “knowledge we have lost in information” (all credit where it is due to T. S. Eliot).

Where Traditional AI and Information Management Fail

The combination of this failure of AI and Information Management can be seen in a few data points:

• 86% of enterprises claim that they are not getting the most out of their data
• 5 out of 10 early data science initiatives fail to get to production
• 74% say their data landscape is so complex that it limits agility

And perhaps most telling: 2/3rds of businesses consider machine learning and AI important business initiatives but only 1/3 or less are confident in their ability to implement them.

Unlocking the Promise of Enterprise AI

This is why we have developed an entirely new solution from the ground up, with open-source and cloud principles in mind, to tackle these challenges and unlock the true promise of Enterprise AI and achieve Data Intelligence. Data Intelligence is what happens when you bring together both halves of the equation: managing your data wherever (and in whatever form) it is, and then extracting value from that data using the latest tools and techniques.

3 Ways Artificial Intelligence Is Uprooting Sales

In 2015, Forrester caused a storm to brew when it announced that artificial intelligence (AI) would replace one million B2B sales jobs by 2020. This bold headline, however, failed to capture the entire picture. Sure, if sales reps continue to rely on age-old practices like cold calling and distributing spray-and-pray marketing collateral, their days are surely numbered. Yet, on the other hand, artificial intelligence has failed to live up to business expectations. Case in point: according to a recent white paper by Pactera Technologies and Nimdzi Insights, 85% of artificial intelligence projects fail to deliver on their intended promises to business. Artificial intelligence and human sales reps are not mutually exclusive. If sales reps adapt to and exploit the ever-increasing capabilities of AI, they stand to gain from its emergence.

Automating repetitive tasks

The majority of a sales rep’s time (63%) is consumed by non-revenue-generating activities. AI has enormous potential to free up sales reps’ time so that they can focus more effectively on selling, building relationships, and closing deals.

According to McKinsey, about half of a sales rep’s workload consists of activities that can be automated by AI. Consider, for example, time management and scheduling. Fewer than one-third (28%) of sales reps adhere to a structured time management methodology. AI-powered scheduling and calendaring solutions go a long way toward transforming time management into time intelligence. Woven, for example, is an AI-powered calendar app created by Tim Campos, the former CIO of Facebook. Woven uses natural language processing to scan users’ email inboxes for signs of meeting requests. Its virtual assistant then generates suggested times to meet and sends emails to attendees to select a time option. The app even uses location data to account for travel time between meeting destinations.
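Woven’s actual NLP is proprietary, but the core idea of scanning email text for signs of a meeting request can be sketched with a naive keyword-based detector (the patterns below are invented for illustration):

```python
import re

# Naive, hand-written patterns hinting that an email is a meeting request.
MEETING_PATTERNS = [
    r"\b(can|could|shall) we meet\b",
    r"\bschedule (a )?(call|meeting|demo)\b",
    r"\bare you (free|available)\b",
]

def looks_like_meeting_request(text):
    """Return True if the email text matches any meeting-request pattern."""
    lowered = text.lower()
    return any(re.search(pattern, lowered) for pattern in MEETING_PATTERNS)

print(looks_like_meeting_request("Can we meet Tuesday at 3pm?"))   # True
print(looks_like_meeting_request("Please find the invoice attached."))  # False
```

A production assistant would replace these regexes with a trained intent classifier and then extract candidate dates and times from the matched messages.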

Taking it one step further, it’s not all that hard to conceive of an app that gives sales reps recommendations as to how they should prioritize their days, depending on their chronotype.

In addition to scheduling, sales reps squander hours each day on email. The majority of sales reps’ time is spent on sales technology (62.8%), with sales-related email consuming the largest share (33.2%). AI-powered apps can liberate sales reps from living in their inboxes. Crystal Knows, for example, uses AI and natural language processing to predict customers’ personalities and, in turn, create personalized email templates that will garner the best responses. It offers sales reps recommendations for specific language and phrasing, thereby saving them a lot of time writing emails from scratch.
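Crystal Knows’s models are proprietary, but the “personality-to-phrasing” step can be illustrated with a toy lookup (the personality labels and openers here are invented, not the product’s actual categories):

```python
# Toy mapping from a predicted personality type to a tailored email opener.
# Labels and text are illustrative only.
OPENERS = {
    "direct": "I'll keep this brief: here is how we can cut your costs.",
    "analytical": "I've attached the benchmark data behind the numbers below.",
    "relational": "Hope your week is going well! I wanted to follow up personally.",
}

DEFAULT_OPENER = "Hi, I wanted to reach out about your team's goals."

def draft_opener(personality):
    """Pick an email opener matched to the predicted personality type."""
    return OPENERS.get(personality, DEFAULT_OPENER)

print(draft_opener("direct"))
print(draft_opener("unknown"))  # falls back to the default opener
```

The real product layers a learned personality model on top of this kind of mapping and suggests phrasing for the whole message, not just the opener.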

Identifying the best leads
Lead scoring is at the heart of any successful demand generation strategy. Enhancing lead scoring capabilities is top-of-mind for sales and marketing professionals alike. While lead scoring methods have become more refined, we’ve only scratched the surface. Only 17% of organizations rate their lead scoring initiatives as highly effective. According to research by Demand Gen, an eye-popping 70% of marketing executives believe that the leads passed to sales are of decent quality, but many are not sales-ready. The result is subpar sales outreach, which helps explain why 50% of sales time today is spent on unproductive prospecting.

Enter AI. AI can monitor an arsenal of different signals to predict a specific lead’s readiness to purchase. Research by Gleanster Research reveals that half of leads are qualified, but not yet ready to buy. AI can unearth the lucrative sales-ready leads. B2B consumers are using more channels to engage with vendors than ever before—from review sites to social media platforms to online communities. AI can mine these platforms for buying signals, couple them with demographic, firmographic, and technographic information, and pinpoint which leads are sales-ready. It can account for nuances such as sentiment to predict buying propensity. In an ideal world, AI allows sales reps to transition from predictive to prescriptive selling by isolating why a lead is a particularly good fit.
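The signal-combination idea can be sketched as a simple weighted score. The feature names and weights below are invented for illustration; a real system would learn them from historical conversion data rather than hard-code them:

```python
# Toy lead score combining behavioral and firmographic signals.
# Feature names and weights are invented for illustration only.
WEIGHTS = {
    "visited_pricing_page": 0.30,
    "mentioned_on_review_site": 0.20,
    "company_size_fit": 0.25,
    "uses_complementary_tech": 0.25,
}

def lead_score(signals):
    """Weighted sum of boolean signals, in [0, 1]; higher = more sales-ready."""
    return sum(weight for name, weight in WEIGHTS.items() if signals.get(name))

hot_lead = {
    "visited_pricing_page": True,
    "company_size_fit": True,
    "uses_complementary_tech": True,
}
print(round(lead_score(hot_lead), 2))  # 0.8
```

Swapping this hand-tuned sum for a trained classifier is what turns the score predictive; adding per-feature explanations of why a lead scored high is what makes it prescriptive.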

Fewer than half of sales reps have data insights on customers’ propensity to buy. Yet, according to Harvard Business Review, companies that use AI for sales are able to increase their leads by 50%. AI helps eliminate the guesswork and empowers sales reps to focus their time most productively.

Enhancing customer relationships
A close read of Forrester’s report reveals that AI will affect different types of sales professionals differently:

• “Order takers”, who process customer orders that could be filled via self-serve channels: 33% job loss
• “Explainers”, who provide buyers with information about complex products: 25% job loss
• “Navigators”, who help buyers understand what their companies need to purchase: 15% job loss
• “Consultants”, who help buyers understand what they need to purchase and who have vast knowledge about the buyer’s company: no job loss – in fact, this subset will witness a 10% gain in available jobs

The sales reps of the future will be a different breed compared with their predecessors. They will assume the essential role of consultants and advisors, leveraging AI to gain the trust and favor of customers. 79% of business buyers say it’s very important or absolutely critical to engage with a salesperson who is a trusted advisor and who adds value to their business. With a deeper understanding of customers’ needs, sales reps will be able to have more relevant and engaging conversations with customers. With knowledge of customers’ pain points, their reasons for buying, what obstacles need to be overcome, and which decision-makers are at the table, sales reps can creatively solve the complex business problems that customers face.

We’ve come a long way since the term “artificial intelligence” was coined in 1955. Only in our current era has the sales profession started to realize the potential of AI. Contrary to some media headlines, AI will never uproot sales professionals entirely. The sales professionals of the future will work in tandem with AI, exploiting—and embracing—its capabilities to acquire new superpowers. Businesses that combine AI with human insights witness a 66% boost in productivity and a 61% increase in customer satisfaction, according to research by Forrester. The key is the marriage between AI’s IQ and humans’ EQ.