Which Software Testing Career Path and Certification is Right for You?

June 28th, 2017 by blogadmin No comments »

Are you wondering which ISTQB certification is right for you? The following is a short explanation of the various certifications and the way you might want to proceed based on your career goals. The tricky thing in our industry is the constant change. The skills of today may or may not be marketable tomorrow, so while you’re thinking about what you want to do, you should also consider what you need to do to be able to get or retain the job you want.

Foundation Level – This is where you need to start. This is the gateway certification for all the other ISTQB certifications. This level is designed for beginners all the way up to those who have been in the industry for a while (or maybe a long while) and need to brush up their skills and update their terminology. One of the fastest ways to fail in an interview is to use the wrong terms for testing processes, documents and techniques. Organizations tend to adopt their own terminology and it helps to have a base of “standard” terminology, particularly before you venture out for an interview.

The Foundation Level is in the process of being expanded to include several extension modules. Right now, the agile extension is due to be available in early 2014 and work is starting on the model-based testing extension. These are separate certifications you can get that are “added on” to your Foundation Level certification.

Advanced Level – This is where you need to start making decisions. What do you like to do? What do you want to do? Where are the most opportunities?

Advanced Level – Test Analyst – If you are not very technically minded, and would rather work with the user, know the business application and apply your skills more for analysis than programming, you want to pursue the Advanced Test Analyst certification. This certification is designed for the strong tester who has a deep understanding of the business domain and the needs of the user. You’ll learn about designing good test documentation, conducting effective reviews, and participating in risk analysis sessions (particularly to help determine the impact of a realized risk on the business user).

You’ll learn how you can contribute test information (input data, action, expected results) to the test automation effort, and you’ll learn about usability testing. You’ll also build upon the test techniques you learned at the Foundation Level and will learn new techniques such as domain analysis and cause-effect graphing, as well as how to test using use cases and user stories. You’ll learn more about defect-based and experience-based techniques, so you’ll know how to pick an appropriate defect taxonomy and how to implement traceable and reproducible exploratory and checklist-based testing.

Let’s not forget process improvement as well. You’ll learn what to track in your defect management to be sure you have the information to figure out what could be improved in your process and how you can do it. This certification is designed for the person who wants to spend their time testing, not programming, delving into the code, or troubleshooting technical issues.
The path for the Advanced Test Analyst at the Expert Level will include a further specialization in usability testing and further development of testing techniques. At this point, these new syllabi are being discussed but will not be available until at least 2015.

Advanced Level – Technical Test Analyst – OK, admit it, you really like to play in the code. You like to review it, program tests to test it, and create test automation and tools. If this describes you, you definitely need to be looking at the Advanced Technical Test Analyst certification. This certification is designed for the technically minded individual who wants to and is capable of programming, both in scripting languages (e.g., Python) and standard programming languages (e.g., Java). You’ll learn how to approach white-box testing to find the difficult problems that are often missed by the black-box testing usually done by the Test Analyst. You will learn strong testing techniques that will allow you to systematically test decision logic, APIs and code paths.

You will also learn about static and dynamic analysis techniques and tools (stamp out those memory leaks!). You will learn about testing for technical quality characteristics such as efficiency (performance), security, reliability, maintainability, and portability. You’ll learn how to do effective code and architectural reviews. And you’ll learn about tools – using tools, making tools, and a little about selecting the right tools. After all, you wouldn’t want to accidentally get a tool that creates code mutants (really, that’s a legitimate tool usage) when you really wanted a simulator. And did I mention automation? You will learn the basis for automation that will be built on at the Expert Level.

The Advanced Technical Test Analyst certification is the gateway to the Expert Level for Test Automation (Engineering) and Security. The Test Automation (Engineering) syllabus and the Security syllabus and their associated certifications are likely to be available in 2014 or early 2015.

Advanced Test Manager – Those who can, do, and those who can’t, manage? Well, that’s not usually a successful formula for a test manager. If you are a test manager or want to be one, and you are willing to learn all the necessary techniques and practices to be successful, then this certification is the one for you. You will learn all about test planning, monitoring and controlling for projects, but you will also learn about establishing test strategies and policies that can change the course of testing for the organization. You will learn how to effectively manage both people and projects, and will learn the importance and application of metrics and estimation techniques. You will learn your role in reviews. You will learn how to effectively manage defects and how to focus on improving the test process. You will also learn the importance and proper usage of tools and be able to set realistic expectations regarding tool usage.

So, if you like telling people what to do, and they tend to listen to you, this is probably the right certification for you. However, that said, remember that technical people respect technical people, so rather than stopping at the Advanced Test Manager certification, you should consider earning at least the Advanced Test Analyst certification as well.
The Advanced Test Manager certification is the gateway to the Expert Levels for Improving the Test Process and Test Management. The Expert Level Improving the Test Process certification focuses on various techniques and models that are used for test process improvement. This provides a good coverage of the most popular models and provides information regarding how to approach an improvement effort to net an effective result. The Expert Level Test Management certification focuses on honing those strategic, operational and personnel skills to make you the best test manager you can be. There is significant discussion in the syllabus about how to be sure your department is performing well and is receiving the accolades it deserves. There is also realistic information regarding managing people effectively and dealing with difficult situations.

The Advanced Test Manager certification is also a pre-requisite for the management part of the Expert Level Test Automation certification. This focuses on how to effectively manage an automation project, including getting the right tools, resources, budget and timeframe. This syllabus should be available in late 2014 or early 2015.

Which Way to Go?
It’s entirely up to you. As you can see, there are several ways you can go with the certification path. And remember, for example, you might not want to get the Advanced Technical Test Analyst certification if you are a test manager, but you can always read the free syllabus and learn something even without a big time investment. The syllabi make for interesting reading, even if you are not planning the particular career path that is indicated. Our industry is constantly changing and new syllabi are always in the works. If you plan to head for the Expert Level, it’s a good idea to start planning your path now, as that may determine which Advanced certification(s) you will need. Keep an eye on the ISTQB web site for new additions to the syllabus family. And remember to train, not just for your current job, but for the next job you want to get. Right now, the job market is hot for those with the skills of the Advanced Technical Test Analyst. There is always a need for good test managers. Note the emphasis on the word “good”. And many companies want Advanced Test Analysts as well because of the need for black-box testing and strong domain knowledge. Right now, the biggest growth is in the Advanced Technical Test Analyst area, but that can change quickly. Get your training now, so you’ll be ready.

It’s unlikely that we will run out of work anytime in the future because, as long as there are developers, there will be a need for testers. It’s built-in job security! Plan and train for your future. It’s looking bright!

Source: All of the above opinions are a personal perspective based on information provided by CSTB


Deep Learning vs. Machine Learning-The essential differences you need to know!

June 8th, 2017 by blogadmin No comments »


Machine learning and deep learning are all the rage! All of a sudden everyone is talking about them – irrespective of whether they understand the differences or not! Whether you have been actively following data science or not, you would have heard these terms.

Just to show you the kind of attention they are getting, here is the Google trend for these keywords:


If you have often wondered what the difference between machine learning and deep learning is, read on to find a detailed comparison in simple, layman’s language. We have explained each of these terms in detail.

 Table of Contents

  1. What is Machine Learning and Deep Learning?
    1. What is Machine Learning?
    2. What is Deep Learning?
  2. Comparison of Machine Learning and Deep Learning
    1. Data Dependencies
    2. Hardware Dependency
    3. Feature Engineering
    4. Problem Solving Approach
    5. Execution time
    6. Interpretability
  3. Where is Machine Learning and Deep Learning being applied right now?
  4. Pop Quiz
  5. Future Trends
1. What is Machine Learning and Deep Learning?

Let us start with the basics – What is Machine Learning and What is Deep Learning. If you already know this, feel free to move to section 2.

 1.1 What is Machine Learning?

The widely-quoted definition of Machine learning by Tom Mitchell best explains machine learning in a nutshell. Here’s what it says:

“A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E ”

Did that sound puzzling or confusing? Let’s break this down with simple examples.

Example 1 – Machine Learning – Predicting weights based on height

Let us say you want to create a system which predicts the expected weight of a person based on height. There could be several reasons why something like this could be of interest. You can use this to filter out any possible frauds or data capturing errors. The first thing you do is collect data. Let us say this is what your data looks like:


Each point on the graph represents one data point. To start with we can draw a simple line to predict weight based on height. For example a simple line:

Weight (in kg) = Height (in cm) – 100

can help us make predictions. While the line does a decent job, we need to understand its performance. In this case, we can say that we want to reduce the difference between the Predictions and actuals. That is our way to measure performance.

Further, the more data points we collect (Experience), the better our model will become. We can also improve our model by adding more variables (e.g. Gender) and creating different prediction lines for them.
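The prediction line and its performance measure can be sketched in a few lines of code. This is a minimal illustration: the sample (height, weight) points and the choice of mean absolute error as the performance measure are assumptions for the example, not data from the article.

```python
# Predict weight with the simple line: Weight (kg) = Height (cm) - 100,
# then measure performance as the mean absolute difference between
# predictions and actual values.

def predict_weight(height_cm):
    """The simple prediction line from the text."""
    return height_cm - 100

# Hypothetical (height_cm, actual_weight_kg) data points.
data = [(150, 52), (160, 59), (170, 72), (180, 80), (190, 93)]

# Performance: average absolute gap between prediction and actual.
errors = [abs(predict_weight(h) - w) for h, w in data]
mean_abs_error = sum(errors) / len(errors)
print(round(mean_abs_error, 1))  # → 1.6
```

Collecting more points (Experience) lets us check whether a different line, or separate lines per gender, lowers this error (Performance).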

Example 2 – Storm prediction System

Let us take a slightly more complex example. Suppose you are building a storm prediction system. You are given the data of all the storms which have occurred in the past, along with the weather conditions three months before the occurrence of these storms.

Consider this, if we were to manually build a storm prediction system, what do we have to do?


We have to first scour through all the data and find patterns in this data. Our task is to search which conditions lead to a storm.

We can either model conditions like – if the temperature is greater than 40 degrees Celsius, humidity is in the range of 80 to 100, etc. – and feed these ‘features’ manually to our system.

Or else, we can make our system understand from the data what will be the appropriate values for these features.

Now to find these values, you would go through all the previous data and try to predict if there will be a storm or not. Based on the values of the features set by our system, we evaluate how the system performs, viz how many times the system correctly predicts the occurrence of a storm. We can further iterate the above step multiple times, giving performance as feedback to our system.

Let’s take our formal definition and try to define our storm prediction system: Our task ‘T’ here is to find what are the atmospheric conditions that would set off a storm. Performance ‘P’ would be, of all the conditions provided to the system, how many times will it correctly predict a storm. And experience ‘E’ would be the reiterations of our system.
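The T/P/E loop described above can be sketched as a tiny program. Everything here is a hypothetical illustration of the definition – the sample weather records, the single temperature threshold, and the brute-force search are stand-ins for a real storm model.

```python
# Hypothetical records: (temperature_celsius, humidity_percent, storm_occurred)
history = [
    (42, 90, True), (44, 85, True), (35, 60, False),
    (30, 70, False), (41, 95, True), (33, 80, False),
]

def accuracy(threshold):
    """Performance P: fraction of records predicted correctly by the
    rule (task T): predict a storm when temperature > threshold."""
    correct = sum((temp > threshold) == storm for temp, _, storm in history)
    return correct / len(history)

# Experience E: iterate over candidate thresholds, keeping the best one.
best = max(range(25, 50), key=accuracy)
print(best, accuracy(best))  # → 35 1.0
```

Each pass over the data is one "reiteration" of the system; performance on past storms is fed back to pick better feature values.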

 1.2 What is Deep Learning?

The concept of deep learning is not new. It has been around for a number of years. But nowadays, with all the hype, deep learning is getting more attention. As we did with machine learning, we will look at a formal definition of deep learning and then break it down with an example.

“Deep learning is a particular kind of machine learning that achieves great power and flexibility by learning to represent the world as a nested hierarchy of concepts, with each concept defined in relation to simpler concepts, and more abstract representations computed in terms of less abstract ones.”

Now, that definition may sound confusing. Let us break it down with a simple example.

Example 1 – Shape detection

Let me start with a simple example which explains how things happen at a conceptual level. Let us try and understand how we recognize a square from other shapes.


The first thing our eyes do is check whether there are 4 lines associated with a figure or not (simple concept). If we find 4 lines, we further check, if they are connected, closed, perpendicular and that they are equal as well (nested hierarchy of concept).

So, we took a complex task (identifying a square) and broke it into simpler, less abstract tasks. Deep learning essentially does this at a large scale.
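The nested hierarchy of concepts can be mimicked directly in code: a complex check built from simpler checks. Representing a shape as a list of side lengths and corner angles is a hypothetical simplification for illustration only.

```python
def has_four_sides(shape):
    """Simple concept: are there exactly 4 lines?"""
    return len(shape["sides"]) == 4

def sides_equal(shape):
    """Simple concept: are all the sides equal?"""
    return len(set(shape["sides"])) == 1

def angles_right(shape):
    """Simple concept: are all the corners perpendicular?"""
    return all(a == 90 for a in shape["angles"])

def is_square(shape):
    """Nested hierarchy: a complex concept defined from simpler ones."""
    return has_four_sides(shape) and sides_equal(shape) and angles_right(shape)

square = {"sides": [5, 5, 5, 5], "angles": [90, 90, 90, 90]}
rectangle = {"sides": [5, 3, 5, 3], "angles": [90, 90, 90, 90]}
print(is_square(square), is_square(rectangle))  # → True False
```

The difference with deep learning is that these intermediate concepts are learned from data rather than written by hand.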

Example 2 – Cat vs. Dog

Let’s take an example of an animal recognizer, where our system has to recognize whether the given image is of a cat or a dog.


If we solve this as a typical machine learning problem, we will define features such as if the animal has whiskers or not, if the animal has ears & if yes, then if they are pointed. In short, we will define the facial features and let the system identify which features are more important in classifying a particular animal.

Now, deep learning takes this one step further. Deep learning automatically finds out the features which are important for classification, whereas in machine learning we had to provide the features manually.

Deep learning works as follows:

  • It first identifies what are the edges that are most relevant to find out a Cat or a Dog
  • It then builds on this hierarchically to find what combination of shapes and edges we can find. For example, whether whiskers are present, or whether ears are present, etc.
  • After consecutive hierarchical identification of complex concepts, it then decides which of these features are responsible for finding the answer.

2. Comparison of Machine Learning and Deep Learning

Now that you have understood an overview of Machine Learning and Deep Learning, we will take a few important points and compare the two techniques.

 2.1 Data dependencies

The most important difference between deep learning and traditional machine learning is their performance as the scale of data increases. When the data is small, deep learning algorithms don’t perform that well. This is because deep learning algorithms need a large amount of data to learn from. On the other hand, traditional machine learning algorithms, with their handcrafted rules, prevail in this scenario. The image below summarizes this fact.


2.2 Hardware dependencies

Deep learning algorithms heavily depend on high-end machines, contrary to traditional machine learning algorithms, which can work on low-end machines. This is because deep learning algorithms require GPUs, which are an integral part of their working. Deep learning algorithms inherently perform a large number of matrix multiplication operations. These operations can be efficiently optimized using a GPU, because a GPU is built for this purpose.
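The matrix multiplication workload is easy to see in code. The naive triple loop below is exactly the operation GPUs accelerate: every cell of the result is an independent dot product, so thousands of them can be computed in parallel. The small matrices are illustrative only.

```python
def matmul(a, b):
    """Naive matrix multiplication. Each output cell is an independent
    dot product, which is why GPUs can compute them all in parallel."""
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul(a, b))  # → [[19, 22], [43, 50]]
```

A deep network performs millions of these products per training step, which is where the GPU dependency comes from.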

2.3 Feature engineering

Feature engineering is the process of putting domain knowledge into the creation of feature extractors, to reduce the complexity of the data and make patterns more visible to learning algorithms. This process is difficult and expensive in terms of time and expertise.

In Machine learning, most of the applied features need to be identified by an expert and then hand-coded as per the domain and data type.

For example, features can be pixel values, shape, textures, position and orientation. The performance of most of the Machine Learning algorithm depends on how accurately the features are identified and extracted.

Deep learning algorithms try to learn high-level features from data. This is a very distinctive part of deep learning and a major step ahead of traditional machine learning. Therefore, deep learning reduces the task of developing a new feature extractor for every problem. For example, a convolutional neural network will try to learn low-level features such as edges and lines in early layers, then parts of faces of people, and then a high-level representation of a face.


2.4 Problem Solving approach

When solving a problem using a traditional machine learning algorithm, it is generally recommended to break the problem down into different parts, solve them individually, and combine them to get the result. Deep learning, in contrast, advocates solving the problem end-to-end.

Let’s take an example to understand this.

Suppose you have a task of multiple object detection: identifying what the object is and where it is present in the image.


In a typical machine learning approach, you would divide the problem into two steps: object detection and object recognition. First, you would use a bounding box detection algorithm like GrabCut to skim through the image and find all the possible objects. Then, of all the recognized objects, you would use an object recognition algorithm, such as an SVM with HOG features, to recognize the relevant objects.

On the contrary, in the deep learning approach, you would do the process end-to-end. For example, in a YOLO net (which is a type of deep learning algorithm), you would pass in an image, and it would give out the location along with the name of the object.

 2.5 Execution time

Usually, a deep learning algorithm takes a long time to train. This is because there are so many parameters in a deep learning algorithm that training them takes longer than usual. The state-of-the-art deep learning model ResNet takes about two weeks to train completely from scratch, whereas machine learning comparatively takes much less time to train, ranging from a few seconds to a few hours.

This in turn is completely reversed at test time. At test time, a deep learning algorithm takes much less time to run. Whereas, if you compare it with k-nearest neighbors (a type of machine learning algorithm), test time increases with the size of the data. Although this is not applicable to all machine learning algorithms, as some of them have small testing times too.
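The k-nearest neighbors point is easy to demonstrate: prediction means comparing the query against every stored training point, so test time grows with the size of the training data. This is a minimal 1-nearest-neighbor sketch with made-up one-dimensional points and labels.

```python
def nearest_neighbor_predict(train, query):
    """1-NN: scan ALL training points at test time, so prediction cost
    grows linearly with the size of the training set."""
    _, label = min(train, key=lambda pt: abs(pt[0] - query))
    return label

# Hypothetical (value, label) training data.
train = [(1.0, "cat"), (2.0, "cat"), (8.0, "dog"), (9.0, "dog")]
print(nearest_neighbor_predict(train, 1.5))  # → cat
print(nearest_neighbor_predict(train, 8.5))  # → dog
```

A trained neural network, by contrast, discards the training data entirely: prediction is a fixed-cost pass through its weights, no matter how much data it was trained on.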

2.6 Interpretability

Last but not least, we have interpretability as a factor for comparing machine learning and deep learning. This factor is the main reason deep learning is still considered ten times over before it is used in industry.

Let’s take an example. Suppose we use deep learning to give automated scoring to essays. The performance it gives in scoring is quite excellent and is near human performance. But there is an issue: it does not reveal why it has given that score. Indeed, mathematically you can find out which nodes of a deep neural network were activated, but we don’t know what these neurons were supposed to model and what these layers of neurons were doing collectively. So we fail to interpret the results.

On the other hand, machine learning algorithms like decision trees give us crisp rules as to why they chose what they chose, so it is particularly easy to interpret the reasoning behind them. Therefore, algorithms like decision trees and linear/logistic regression are primarily used in industry for their interpretability.
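The interpretability contrast can be made concrete. A learned decision tree boils down to crisp, human-readable rules like the hand-written sketch below; the loan-style features and thresholds here are entirely hypothetical.

```python
def approve_loan(income, credit_score):
    """A decision-tree-style rule set: every decision path can be read
    off and explained, unlike the activations of a deep neural network."""
    if credit_score >= 700:
        return "approve"          # rule 1: strong credit alone suffices
    if credit_score >= 600 and income >= 50000:
        return "approve"          # rule 2: decent credit plus high income
    return "reject"               # otherwise: reject

print(approve_loan(income=40000, credit_score=720))  # → approve
print(approve_loan(income=40000, credit_score=650))  # → reject
```

If a customer asks why they were rejected, you can point to the exact rule that fired; no such explanation falls out of a deep network’s weights.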

3. Where are Machine Learning and Deep Learning being applied right now?

The wiki article gives an overview of all the domains where machine learning has been applied. These include:

  • Computer Vision: for applications like vehicle number plate identification and facial recognition.
  • Information Retrieval: for applications like search engines, both text search, and image search.
  • Marketing: for applications like automated email marketing, target identification
  • Medical Diagnosis: for applications like cancer identification, anomaly detection
  • Natural Language Processing: for applications like sentiment analysis, photo tagging
  • Online Advertising, etc


The image given above aptly summarizes the application areas of machine learning, although it covers the broader topic of machine intelligence as a whole.

One prime example of a company using machine learning / deep learning is Google.


In the above image, you can see how Google is applying machine learning in its various products. Applications of Machine Learning/Deep Learning are endless, you just have to look at the right opportunity!

4. Pop Quiz

To assess whether you really understood the difference, we will do a quiz. You can post your answers in this thread.

Please address the steps below to completely answer each scenario.

  • How would you solve the below problem using Machine learning?
  • How would you solve the below problem using Deep learning?
  • Conclusion: Which is a better approach?

Scenario 1:

You have to build a software component for a self-driving car. The system you build should take in the raw pixel data from cameras and predict the angle by which you should steer the car’s wheel.

Scenario 2:

Given a person’s credentials and background information, your system should assess whether a person should be eligible for a loan grant.

Scenario 3:

You have to create a system that can translate a message written in Russian to Hindi so that a Russian delegate can address the local masses.

5. Future Trends

The above article should have given you an overview of machine learning and deep learning and the differences between them. In this section, I’m sharing my views on how machine learning and deep learning will progress in the future.

  • First of all, seeing the increasing trend of using data science and machine learning in the industry, it will become increasingly important for each company that wants to survive to inculcate machine learning in its business. Also, each and every individual will be expected to know the basic terminology.
  • Deep learning is surprising us each and every day, and will continue to do so in the near future. This is because deep learning is proving to be one of the best techniques discovered so far, with state-of-the-art performance.
  • Research is continuous in machine learning and deep learning. But unlike in previous years, when research was limited to academia, research in machine learning and deep learning is now exploding in both industry and academia. And with more funds available than ever before, it is more likely to be a key driver of human development overall.

I personally follow these trends closely. I generally get a scoop from machine learning/deep learning newsletters, which keep me updated with recent happenings. Along with this, I follow arXiv papers, and their respective code, which are published every day.

End notes

In this article, we had a high-level overview and comparison of deep learning and machine learning techniques. I hope I could motivate you to learn further in machine learning and deep learning. Here are the learning paths for both: Learning path for machine learning and Learning path for deep learning.

Source: All of the above opinions are a personal perspective based on information provided by Analytics Vidhya


SAP Cash Application: Intelligent and Integrated Payment Clearing Automation for SAP S/4HANA powered by SAP Leonardo Machine Learning

May 23rd, 2017 by blogadmin No comments »


With its promise for new levels of automation and employee productivity, artificial intelligence (AI) is one of the hottest topics in the market.

But unless you cut through the hype, it is sometimes hard to understand whether you can really apply these concepts to your everyday business processes – especially if you cannot afford substantial technology investments or scores of data scientists. For example, if you are in corporate finance or shared services, you know you could benefit from a combination of different automation scenarios for accounts payable (AP) and accounts receivable (AR) processes. But how can you get started with AI, if you do not have the right expertise in-house?

Gaining Higher Efficiency and Improving Working Capital Metrics

SAP Cash Application is a novel cloud service that delivers an advanced level of automation for the clearing of incoming payments in SAP S/4HANA. It uses machine learning to match incoming electronic bank payments to open receivables. The service either automatically clears incoming payments or suggests a short list of possible clearing matches that your employees can quickly investigate. With this level of automation, you not only gain efficiency, as your AR team can handle higher transaction loads, but also improve your days sales outstanding (DSO) and working capital metrics. With machine learning, it is easy to reach high automation rates without the need for a hand-tuned system.

Advancing Automation and Reducing Maintenance and Costs at Alpiq

Another major advantage of using machine learning in your payment clearing process is that your application seamlessly adapts to changing conditions, as it constantly learns from new data patterns and actions your AR team takes to match exceptions. This aspect is extremely important for Alpiq, one of our early adopters of this solution. Alpiq started a co-innovation project with SAP, because they wanted a solution that could effectively automate the payment clearing process with minimum maintenance and implementation costs.

Before working with SAP, Alpiq had used a traditional rule-based approach to automating its payment clearing process. But, with constant format changes and the addition of new payment methods, maintaining the rules had quickly become a challenge. As a result of the co-innovation project, Alpiq is confident it will be able to rely on a single integrated environment that learns from accountants’ behavior and leverages both historical data and existing AR workflows – with minimal maintenance required. Most importantly, the company is looking at automation rates of over 92 percent, enabling their shared service teams to process higher transaction volumes, focus on strategic tasks, and scale with the business.

Easy Access and Consumption

If you are considering increasing automation in finance, SAP Cash Application is a great way to start. As a SAP Leonardo cloud service, it can be instantly provisioned, and it automatically works with your SAP S/4HANA implementation. It allows you to take a pragmatic approach to innovation since you can start with a well-defined process that you can monitor and measure. This is important whether you are approaching automation for the first time or, like Alpiq, you want to modernize your automation strategy to lower costs and increase efficiency.

About SAP

As market leader in enterprise application software, SAP (NYSE: SAP) helps companies of all sizes and industries run better. From back office to boardroom, warehouse to storefront, desktop to mobile device – SAP empowers people and organizations to work together more efficiently and use business insight more effectively to stay ahead of the competition. SAP applications and services enable more than 345,000 business and public sector customers to operate profitably, adapt continuously, and grow sustainably. For more information, visit www.sap.com.


Tags: AI, Alpiq, artificial intelligence, machine learning, SAP Cash Application, SAP Leonardo, SAP S/4HANA, SAPPHIRE NOW

Source: All of the above is a personal perspective based on information provided by SAP on SAP Cash Application





May 12th, 2017 by blogadmin No comments »

“The Report Global Software Testing Market 2017-2021 provides information on pricing, market analysis, shares, forecast, and company profiles for key industry participants. – MarketResearchReports.biz”

The global software testing market is poised to exhibit a strong CAGR of 14% from 2017 to 2021, according to a report recently added to the growing repository of MarketResearchReports.biz. Titled “Global Software Testing Market 2017-2021”, the study highlights the current scenario in the market and its future trajectory. It answers some of the pertinent questions related to the software testing market, such as what is the current and projected size of the market, what are the major challenges and trends that are likely to impact the market, and what are the outcomes of the five forces analysis.

The report states that one of the most prominent factors boosting the uptake of software testing services is the surge in test automation services and agile testing services. Companies have been adopting these services to improve the quality of cloud infrastructure and to put into practice newer methodologies for software testing services. The market is also fueled by the rising pressure on software providers to offer business as well as product value.

Technology-wise, the market is bifurcated into product testing and application testing. The latter is not only the largest segment but is also poised to register the highest CAGR over the course of the forecast period. Application testing may include a range of services: mobile application testing, new offers testing, security testing, and functional as well as non-functional testing. In addition to this, the segment is fueled by the increasing demand for enterprise mobility.

In terms of end use, software testing services are in demand in sectors such as telecom, media, banking, financial services, and insurance (BFSI), IT, and retail. The BFSI sector accounts for the leading share owing to the rising need to help customers access financial services on the go.


By way of geography, the worldwide market for software testing is segmented into Asia Pacific, Europe, the Middle East, and Africa, and the Americas. Currently, the Americas hold the largest share and this regional segment is slated to continue its dominance over the global market throughout the forecast period. Within the Americas, the banking and telecom industries are the main end users of software testing services, which can be attributed to growing consumerization of location-based applications and data services. The soaring adoption of and demand for cloud services is also sure to boost software testing market in the Americas.

View Press Release @ http://www.marketresearchreports.biz/pressrelease/4269

The most prominent players in the global software testing market are Capgemini, Wipro, IBM, and Accenture. Together, these players dominate the overall market, making its vendor landscape highly consolidated. These few established companies have been joining forces with smaller vendors to further their ongoing innovations and enhance their software testing offerings. They also enjoy a strong hold over most regional markets. Other major companies operating in the software testing market are Atos, UST Global, Steria, Gallop Solutions, CSC, Cigniti Technologies, Tech Mahindra, Deloitte, NTT DATA, and Infosys.

Source: All the above is a personal perspective based on information provided by Latest Software Testing News & MarketResearchReports.biz

Source: http://www.latestsoftwaretestingnews.com/?p=8738

14th Annual IEP (Internationally Educated Professionals) Conference

February 27th, 2017 by blogadmin No comments »

The 14th Annual IEP (Internationally Educated Professionals) Conference is being held on March 3rd, 2017, at the Metro Toronto Convention Centre.

Insight into IEP Conference

This is a unique event for connecting and networking between prospective internationally educated professionals and Canadian employers.

Every year, Canada welcomes international professionals to come, work, and be part of this great, successful nation. The IEP Conference plays an influential part in connecting IEPs with employers who are looking for new talent to come and work for them.

 What? An Overview

The IEP Conference is a unique Canadian initiative for Internationally Educated Professionals (IEPs), taking interactive learning, economic development, and stakeholder awareness to unprecedented levels. Every participant can engage in a myriad of career and personal development experiences, guided and encouraged every step of the way by subject matter experts from licensing bodies, educational institutions, and professional associations, as well as skilled facilitators and successful IEPs. Hundreds of previous attendees hail this free event for IEPs as a “not to be missed” initiative, offering an innovative, highly focused, and productive platform for converting career dreams into practical action plans for rewarding employment.

Why? Canada’s Employment ‘Puzzle’

The IEP Conference is a unique forum offering unparalleled connection opportunities and much more! Many newcomers in search of meaningful careers find it difficult to access the Canadian job market. Possessing the necessary skills is just the first piece of the puzzle. Connecting with employers who are hiring, gaining Canadian work experience, and adapting to Canadian business culture are also essential ingredients in securing a better job. The ultimate goal is to connect the skills, enthusiasm, and potential of this unique labour pool with relevant stakeholders.

Who? Crossing the Bridge – From Connections to Action

In our 14th year, we anticipate attracting over 1,000 participants as well as industry representatives, government officials, regulatory/accreditation bodies, educational institutions and key associations.

Conference Highlights


  • Spotlight on five major sectors: Finance & Accounting; Information Technology; Healthcare and Related Professions; Engineering; and Sales, Marketing & Communications
  • Sector sessions featuring interactive, solutions-oriented discussions and panels on accessing the Toronto job market
  • Concurrent skill-building sessions in sector hubs on job search strategies and practical tips, linking IEPs with employer needs
  • Marketplace – the conference’s one-stop venue for the latest on accreditation, certification, education, training, employment and employer information
  • Directory of Resources – a unique, information-packed compendium, essential for every IEP – updated annually


PS: All the above is a personal perspective based on information provided by IEP.

Source: http://www.iep.ca/index.htm



SAP S/4HANA: 10 Questions Answered

February 2nd, 2017 by blogadmin No comments »

As they consider switching to the new SAP S/4HANA business suite, many customers are wondering how to make their move and when the right time will arrive.
SAP S/4HANA expert Michael Sokollek is here to provide the answers.
1. Why choose SAP HANA?
The main purpose of SAP HANA lies in executing all kinds of complex queries in the blink of an eye without aggregating (and thus sacrificing) data in advance. The in-memory technology at its core makes it possible to analyze even the largest data volumes in a matter of seconds. To facilitate real-time processing like this, SAP HANA takes advantage of the memory capacity and various caches provided by modern hardware. In fact, the technology has also influenced the development of new generations of CPUs.
This is how SAP HANA tackles virtually every challenge users can throw at it in terms of data availability, consistency, and integrity, while meeting all ACID requirements (atomicity, consistency, isolation, and durability).
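ACID behavior is not specific to SAP HANA; any transactional database offers it. As a rough illustration (using Python’s built-in sqlite3 purely as a stand-in, not SAP software, with an invented accounts table), atomicity means an interrupted multi-step update leaves no partial state behind:

```python
import sqlite3

# In-memory database standing in for any ACID-compliant store
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 80 "
                     "WHERE name = 'alice'")
        # Simulate a failure before the matching credit is posted:
        # the debit above must not persist on its own
        raise RuntimeError("transfer interrupted")
except RuntimeError:
    pass

balance = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'"
).fetchone()[0]
print(balance)  # 100 -- the partial debit was rolled back (atomicity)
```

The same guarantee is what lets a database answer arbitrary queries mid-workload without ever exposing half-finished updates.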
Additional resources:
SAP HANA website
SAP HANA Hardware Directory
Data center readiness
SAP HANA 2: The Next-Generation Platform for Digital Transformation
What is SAP HANA 2?
SAP HANA road map
2. What are the advantages of an in-memory platform?
In-memory databases make it possible to process even the largest masses of data in short order. This alone is a significant advantage over previous technologies. Meanwhile, in-memory technology also opens the door to further options in developing innovative applications that offer practical business utility, including:
• Integration of text, images, video, and various other data formats through virtual access (without having to import the data into SAP HANA), real-time transfer, time-dependent integration, or data imports from Hadoop clusters
• Application development directly on the platform without a separate application server
• Functions that provide a foundation for further data processing, displaying data on maps based on SAP HANA geospatial engine, for instance.
This enables customers to assemble application scenarios without additional hardware and software while avoiding data redundancy.
Additional resources:
• A platform for applications from both SAP and other providers
• Business transformation with SAP HANA
• IT Simplification with the SAP HANA Platform (brochure)
• IT Simplification with the SAP HANA Platform (infographic)
• SAP HANA Customer Impact Map, which offers more than 100 use cases from customer projects involving IT simplification, performance enhancement, and lowering TCO with SAP HANA
3. What is SAP S/4HANA?
SAP S/4HANA is a real-time ERP suite for digital business. It is based entirely on the in-memory SAP HANA platform, which facilitates a much simpler data model. Thanks to SAP Fiori, users enjoy the kind of personalized, role-based interface they’ve grown used to as consumers and increasingly expect in the business world, as well. Here, SAP S/4HANA avoids integration gaps in order to provide end-to-end process support. This means you no longer have to switch to a business warehouse to generate a report, for example, or complete steps manually outside of your SAP landscape.
While SAP S/4HANA has been built from the ground up, it essentially provides the same functional scope as SAP ERP and is based on that application’s data model, although in radically simplified form. As such, SAP S/4HANA is compatible with conventional SAP ERP systems and doesn’t require a full-scale (greenfield) reimplementation.
With SAP S/4HANA Finance, customers have already been simplifying their financial and accounting activities since 2014. SAP S/4HANA has been available as a standalone product since November 2015, and its latest version (1610) was released in late October 2016. SAP has also plotted out a three-year road map to solidify its plans regarding the suite’s ongoing development (see question 7).
4. Why move to SAP S/4HANA?
As the successor to SAP Business Suite, SAP S/4HANA makes in-memory technology accessible to previous users of SAP ERP. To take advantage of the technical possibilities afforded by the SAP HANA in-memory platform, SAP had to simplify the underlying data model and provide for new interfaces in order to streamline processes and ensure an intuitive user experience. Making adjustments of that scale to an already established application like SAP ERP would be simply impossible.
Meanwhile, SAP S/4HANA leverages not only simple interfaces based on streamlined data structures, but the capabilities of its underlying platform, as well – including full text search. This provides companies with a technical foundation for improving their current processes and defining new ones as a basis for new products, services, and brand-new business models.
“SAP S/4HANA makes in-memory technology accessible to SAP ERP users”
Technical advancements like these can both radically simplify and enhance existing processes, as the following example illustrates. Whether as a hobby or a profession, anyone who wanted to share their own photos in the 1980s and ’90s will likely still recall the complicated process of inserting film, snapping shots, developing them, post-processing negatives, and sending out the printed photos by mail. It was an intricate and time-consuming affair, to say the least, and the only way to find out whether the results matched your expectations was to see it all the way through.
Along with the Internet, modern smartphones and their built-in high-resolution cameras have changed all this not only by making the process faster, but also by reducing the entire underlying “infrastructure.” Furthermore, this example shows that speed alone isn’t enough: Unless the foundational architecture is simplified, the corresponding processes will also remain complex and unable to provide an environment in which creative solutions can be developed for the problems at hand.
Put another way, existing ERP solutions – including SAP ERP powered by SAP HANA – represent nothing more than the conventional way to share photos by mail. SAP S/4HANA, on the other hand, offers the chance to do more than just accelerate such functions; based on a new architecture, it provides for sweeping simplification and support for processes that were previously unavailable.
5. How does SAP S/4HANA simplify the previous SAP Business Suite?
SAP S/4HANA simplifies various areas of SAP Business Suite in its existing form.
User interface: Moving from the transactional SAP GUI to the role-based SAP Fiori has reduced the number of screens in the application, the need to switch between them, and the fields essential to each role and process.
Eliminating the separation between OLTP and OLAP functions has made complex ERP analyses possible again. These analyses can serve as starting points for transactions, including in material planning: If a given stock KPI falls below a certain threshold, for example, the system will report a shortage, enabling users to act immediately in response.
SAP Fiori supports virtually any device (including computers, tablets, and smartphones) in order to offer end users the same user-friendly interface and functions regardless of their preferred platform. Those familiar with the processes they need can essentially jump right into the software without any training.
Function: SAP follows the “principle of one,” which means that SAP S/4HANA’s ERP system (or an entire SAP system landscape) offers a single solution for a given business requirement. In the past, SAP ERP contained multiple transactions for one particular business process. A number of solutions previously found within SAP ERP, including in e-commerce, have also been replaced by SAP Hybris.
Data model: With SAP S/4HANA, aggregating data in established formats for every posting – that is, compiling information in a summarized form in order to produce more compact data packages – is no longer required. In the past, developers predefined aggregates to achieve more efficient data processing. The drawback in doing so, however, is that such aggregates generally pertain to a specific type of query. As a result, most new queries require a new type of aggregate. Foregoing data aggregation in favor of selecting individual items based on arbitrary criteria gives users a great deal of flexibility in generating analyses. It also enables SAP S/4HANA to attain higher throughput rates (read: more postings), as fewer tables need to be updated and no time is wasted waiting due to database locks.
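The aggregate trade-off described above can be sketched in a few lines. This is an illustrative Python analogy, not SAP code; the posting records, field names, and queries are invented and have nothing to do with SAP’s actual table structures:

```python
# Hypothetical line items, analogous to individual postings
postings = [
    {"account": "4000", "region": "EMEA", "amount": 120},
    {"account": "4000", "region": "APJ",  "amount": 80},
    {"account": "5000", "region": "EMEA", "amount": 200},
]

# Old style: a predefined aggregate, maintained on every posting.
# It answers exactly one question (totals per account) and nothing else.
totals_by_account = {}
for p in postings:
    totals_by_account[p["account"]] = (
        totals_by_account.get(p["account"], 0) + p["amount"]
    )

# New style: keep only line items and aggregate on the fly with any
# filter -- here, the EMEA total, a question the aggregate above could
# not answer without a second, separately maintained table.
emea_total = sum(p["amount"] for p in postings if p["region"] == "EMEA")

print(totals_by_account)  # {'4000': 200, '5000': 200}
print(emea_total)         # 320
```

The in-memory column store is what makes the second style fast enough to replace the first at scale; the sketch only shows why dropping precomputed aggregates buys flexibility.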
Landscape: In its ongoing mission to simplify your system landscape, the latest release of SAP S/4HANA (1610) now includes SAP Extended Warehouse Management as an embedded EWM component. Release 1610 also incorporates the production planning (PP) and detailed scheduling (DS) components previously exclusive to SAP Advanced Planning and Optimization. This eliminates the need for the Core Interface (CIF) and reduces data redundancy, which in turn lowers operational and infrastructure costs.
6. What functions are available?
In release 1610, customers who already use SAP S/4HANA can access functions in the same manner and scope as in SAP ERP. SAP S/4HANA also offers an array of new user interfaces and capabilities that enable companies to significantly simplify their processes in various ways. Corresponding use cases for releases 1511 and 1610 describe examples of how to do so from both a technical and area-specific perspective.
The differences between SAP S/4HANA and SAP ERP have been documented in this Simplification List. Customer experiences have shown that only around 10 percent of these simplifications will be relevant to a given individual system. Meanwhile, functions that have not yet been overhauled are available in compatibility packs.
New customers can implement SAP S/4HANA based on SAP’s own best practices, which involves additional support through SAP Activate.
7. What future plans are in place for SAP S/4HANA?
New releases of SAP S/4HANA are scheduled to appear every year, with updates arriving on a quarterly basis. Each release will receive maintenance support for five years (until December 31, 2021 in the case of release 1610, for example).
Road maps covering a three-year period are also available for SAP S/4HANA. The current road map for the Q4 2016 edition can be found here (and all other road maps here).
8. What added value does SAP S/4HANA offer, and where can I see proof? Are there any use cases?
In essence, SAP S/4HANA features three main characteristics:
• A new design that focuses on users and their role-specific needs, can be accessed in-browser on any device, and supports communication among all process participants in a given system
• A modern architecture based on a simplified data model that makes it possible to map all of your processes in a single system – including those that were previously in place in separate installations
• An intelligent business approach that actively notifies users when certain parameters exceed or fall short of critical KPIs and offers forecasts based on predictive algorithms
At CeBIT 2016, Wieland Schreiner, executive vice president at SAP, explained how SAP S/4HANA is blazing a trail into the digital future. In a simple product demo (video in German), he illustrates how SAP S/4HANA makes it easier to render better decisions in inventory management. As a result, you’ll be able to minimize your security stocks and reduce capital tie-up. This is just one of many examples of the innovative potential companies like yours can take advantage of in SAP S/4HANA.
The following whitepapers explain how SAP S/4HANA addresses common pain points specific areas have faced in the conventional SAP Business Suite:
• Manufacturing whitepaper
• Supply chain whitepaper
• Sales whitepaper
• Research and development whitepaper
• Sourcing and procurement whitepaper
• Finance whitepaper
• Asset management whitepaper
9. How can I switch to SAP S/4HANA?
There are three ways to move to SAP S/4HANA:
• Fresh installation (greenfield approach): Implementation of a new system in line with processes based on SAP’s best practices, data migration from a previous system (SAP ERP or other), inclusion of select enhancements
• System conversion (brownfield approach): Transition to SAP S/4HANA in a manner similar to an upgrade or a migration; your configuration, data, and enhancements will remain as they are.
• Landscape transformation: Consolidation of an existing SAP ERP landscape, installation of a new SAP S/4HANA system, or conversion of an existing system
There are no differences between these approaches in terms of the options that used to be available in switching to previous versions of SAP Business Suite. SAP S/4HANA simplifies the established data model and features SAP Fiori, which provides users with a new, standardized interface on a wide variety of devices. At the same time, SAP also gives its customers the option to move to SAP S/4HANA by means of a system conversion. The data models of SAP ERP ECC and SAP S/4HANA are compatible, and SAP GUI will continue to be supported. The adjustments necessary before and during the transition to SAP S/4HANA are also documented in the Simplification List.
Even after completing a technical feasibility analysis, many customers find themselves wondering what their actual goals are in switching to SAP S/4HANA and whether they can achieve them in a system conversion. Answering these questions requires more than just an estimate of what a conversion would initially cost. It’s also important to assess whether it makes more sense to achieve the objectives at hand – making increased use of standard SAP offerings or improving data quality, for example – in a greenfield approach, or by carrying out a technical system conversion and implementing optimizations either simultaneously or after the fact. This is why the ultimate decision for or against a conversion always has to be made by each individual customer.
The fact that moving to SAP S/4HANA is different from a single self-contained project (like an upgrade or a database migration) is another key factor. An endeavor like this will involve your entire IT system landscape, and not because you’ll be upgrading other systems to newer releases or to a SAP HANA database at the same time. The reason is rather that the transition to SAP S/4HANA will simplify and clean up your IT landscape and all of the processes mapped within it, which requires a holistic perspective.
Additional Resources:
• SAP S/4HANA Value Assurance Map Program
• User Assistance for SAP S/4HANA 1610 (includes a conversion guide)
• Overview: Simplification List
• Simplification List (Excel)
• Top Simplification List Items
• Overview: System Conversion
• SAP S/4HANA 1610 – Business Functions
• Packaged services to aid switching to SAP S/4HANA
10. What are the actual steps I should take to approach this subject?
Every customer should take a look at the SAP HANA in-memory platform and consider making SAP S/4HANA the digital core of their company – and time is of the essence. This suite makes it possible to base an organization squarely on digital processes, which serve as an effective foundation for implementing and marketing compelling business scenarios. SAP S/4HANA is thus more than just a successor to SAP Business Suite; it represents an all-new product line.
For that reason, current investments and planned projects should be reviewed in terms of the strategic benefits they offer to a target SAP architecture with SAP S/4HANA at its core. Along with the necessary code adjustments and other technical issues, this should also involve analyzing established processes with an eye toward the performance characteristics of SAP S/4HANA; here, the use cases described under question 8 can provide inspiration.
PS: All the above is a personal perspective based on exposure to information provided by SAP.

Source : https://news.sap.com/sap-s4hana-10-questions-answered/

What are the different levels of SAP certification for FICO and the S4 HANA in simple finance?

January 27th, 2017 by blogadmin No comments »

As you may know, S/4HANA is the business application that is replacing ECC. At this time, S/4HANA 1610 is available in the market.

As mentioned by the previous author, there are generally two levels of certification: Associate and Professional.

At this time, S/4HANA Finance has the following certifications:

1. Associate Level

Certification Code: C_TS4FI_1511

SAP Certified Application Associate – SAP S/4HANA for Financial Accounting Associates (SAP S/4HANA 1511)

2. Professional Level

SAP Certified Application Professional – Financials in SAP S/4HANA for SAP ERP Finance Experts


#S4HANA customer series: Multinational Robotics Firm in Asia Reimagines Business with SAP S/4HANA

December 28th, 2016 by blogadmin No comments »

Since the launch of SAP S/4HANA, the momentum around it has grown as companies operating in diverse industries have recognized the advantages of having a digital core to ignite digital transformation. In this series, we will show the value SAP S/4HANA can bring with the help of example businesses that have already taken the road to SAP S/4HANA.

A multinational robotics firm that designs, manufactures, and deploys advanced robotics systems for automation at distribution and fulfillment centers chose SAP S/4HANA to enhance its digital enterprise journey. This ‘Top Innovator’ award-winning enterprise has its manufacturing base in Asia, with offices spread across multiple countries. With its portfolio of material handling automation and technology products, the company holds an impressive share of Asia’s warehouse automation market.

As a net-new SAP customer, the firm followed an adoption path similar to that of a greenfield implementation. The customer was keen to leverage the end-to-end digital capabilities of SAP S/4HANA and went live in about five months with the following functionalities:

  • Finance (FIN): Core Finance, Asset Accounting, Controlling, Credit Management, India Localization, Accounts payable and Accounts receivable
  • Sales and Distribution (SD): Output Management with Adobe forms (with BRF+), Pricing, Sale from Stock, MTO and MTS, Buy and Sale process for stock transfer, Customer Returns, Service Sales, Consignment and Third-Party Sales
  • Materials Management (MM): Output Management with Adobe forms (with BRF+), Procurement process for direct materials, Subcontracting, Domestic & Import Procurement, Service Procurement, Consumables, Inventory Management
  • Production Planning (PP): Production Orders (Discrete), MRP, Variant Configuration, Product Costing
  • Quality Management (QM): Basic QM processes with Inspection lots (Incoming Inspection)

 Achieved Business Value

With SAP S/4HANA, the customer achieved significant business value in their Finance and Logistics functions. To start with, there were significant improvements in the fulfillment of sales orders and delivery processing. By implementing the Material Ledger (ML) functionality with SAP S/4HANA, they were able to achieve enhanced operational efficiency in product costing and inventory valuation. As ML allows valuation in two additional currencies, this global organization operating in several countries can now valuate inventory in multiple currencies.

Deployment of SAP Fiori apps not only improved efficiency, but also enabled real-time insight into business KPIs, which ensured a better decision-making process. It also improved user productivity by simplifying and automating day-to-day tasks. The use of BRF+ reduced the coding effort around Output Management scenarios, which brought about simplicity as well. And last but not least, the implementation of the localization functionality in SAP S/4HANA 1511 made it easier to manage legal requirements.

The Way Forward Looks Promising

With the SAP S/4HANA solution, the award-winning robotics firm now has real-time visibility into its operations, can fast-track its international expansion, and is able to focus on its core objective: solving operational inefficiencies and manpower issues at distribution centers and warehouses.

This is only one of many examples that prove SAP S/4HANA delivers the value needed to lead your business through digital transformation. If you enjoy this kind of content, please let us know by liking this blog. Find more success stories here, and stay tuned for more parts of this #S4HANA customer series.

Source: All the above is a personal perspective based on information provided by SAP on S/4HANA


18 Free Exploratory Data Analysis Tools for People who don’t code so well

November 30th, 2016 by blogadmin No comments »

All of us are born with special talents. It’s just a matter of time until we discover them and start believing in ourselves.

Some people struggle when they start coding in R, even though a lot more can often be done than one might ever think! Others have never coded at all, not even a “Hello World”, in their entire life. Below are several non-coding tools available for data analysis.

List of Non Programming Tools

1. Excel / Spreadsheet

Anyone transitioning into data science, or anyone who has already survived in it for years, knows that Excel remains an indispensable part of the analytics industry. Even today, most of the problems faced in analytics projects are solved using this software. It supports all the important features, such as summarizing data, visualizing data, and data wrangling, which are powerful enough to inspect data from all possible angles. No matter how many tools a person knows, Excel must feature in their armory. Microsoft Excel is paid software, but various other spreadsheet tools, such as OpenOffice and Google Docs, are certainly worth a try!

2. Trifacta

Trifacta’s Wrangler tool is challenging the traditional methods of data cleaning and manipulation. While Excel imposes limits on data size, this tool has no such boundaries, and you can securely work on big data sets. It has incredible features such as chart recommendations, inbuilt algorithms, and analysis insights, with which anyone can generate reports in no time. It’s an intelligent tool focused on solving business problems faster, thereby allowing us to be more productive in data-related exercises.

3. RapidMiner

This tool emerged as a leader in the 2016 Gartner Magic Quadrant for Advanced Analytics. It’s more than a data cleaning tool; it extends its expertise to building machine learning models and comprises all the frequently used ML algorithms. Not just a GUI, it also supports people using Python & R for model building. In short, it’s a complete tool for any business that needs to perform all tasks from data loading to model deployment.

4. Rattle GUI

If anyone has tried using R but couldn’t get the knack of what’s going on, Rattle should be their first choice. This GUI is built on R and is launched by typing install.packages(“rattle”), followed by library(rattle) and then rattle(), in R. Therefore, to use Rattle, installing R is a must. It’s also more than just a data mining tool: Rattle supports various ML algorithms such as trees, SVM, boosting, neural nets, survival models, linear models, etc.

5. Qlikview

Qlikview is one of the most popular tools in the business intelligence industry around the world. This tool derives business insights and presents them in an awesome manner. With its state-of-the-art visualization capabilities, it gives you a tremendous amount of control while working on data. It has an inbuilt recommendation engine that is updated from time to time with the best visualization methods for the data sets you are working on.

6. Weka

An advantage of using Weka is that it is easy to learn. Being a machine learning tool, its interface is intuitive enough to get the job done quickly. It provides options for data pre-processing, classification, regression, clustering, association rules, and visualization. Most of the steps of model building can be achieved using Weka. It’s built on Java.


7. KNIME

Similar to RapidMiner, KNIME offers an open source analytics platform for analyzing data, which can later be deployed and scaled using other supportive KNIME products. This tool has an abundance of features for data blending, visualization, and advanced machine learning algorithms. Models can be built with this tool as well.

8. Orange

As cool as it sounds, this tool is designed for producing interactive data visualizations and data mining tasks, and there are plenty of YouTube tutorials for learning it. It has an extensive library of data mining tasks, including all the classification, regression, and clustering methods. In addition, the versatile visualizations formed during data analysis allow you to understand the data more closely.

9. Tableau Public

Tableau is data visualization software. We can say that Tableau and Qlikview are the most powerful sharks in the business intelligence ocean, and the comparison of superiority is never-ending. It’s fast visualization software that allows you to explore data, down to every observation, using every possible chart. Its intelligent algorithms figure out by themselves the type of data, the best method available, and so on. For understanding data in real time, Tableau can get the job done. In a way, Tableau imparts a colorful life to data and allows you to share your work with others.

10. Data Wrapper

It’s lightning-fast visualization software. If someone gets assigned BI work and has no clue what to do, this software is a considerable option. Its visualization bucket comprises line charts, bar charts, column charts, pie charts, stacked bar charts, and maps. So, it’s a basic piece of software and can’t be compared with giants like Tableau and Qlikview. This tool runs in the browser and doesn’t require any software installation.

11. Data Science Studio (DSS)

It is a powerful tool designed to connect technology, business, and data. It is available in two segments: coding and non-coding. It’s a complete package for any organization that aims to develop, build, deploy, and scale models on a network. DSS is also powerful enough to create smart data applications to solve real-world problems. It comprises features that facilitate team integration on projects. Most interestingly, work can be reproduced in DSS, as every action in the system is versioned through an integrated Git repository.

12. OpenRefine

It started as Google Refine, but it appears Google dropped the project for unclear reasons. However, the tool is still available, renamed OpenRefine. Among the generous list of open source tools, OpenRefine specializes in cleaning data, transforming it, and shaping it for predictive modeling purposes. As an interesting (if not so pleasant) fact, 80% of an analyst’s time during model building is spent on data cleaning. Using OpenRefine, analysts can not only save that time, but also put it to use for productive work.

13. Talend

Decision making these days is largely driven by data. Managers and professionals no longer make gut-based decisions; they require a tool that can help them quickly. Talend can help them explore data and support their decision making. Precisely, it’s a data collaboration tool capable of cleaning, transforming, and visualizing data. Moreover, it offers an interesting automation feature whereby you can save and redo a previous task on a new data set, a feature that is unique and not found in many tools. It also performs auto-discovery and provides smart suggestions to the user for enhanced data analysis.

14. Data Preparator

This tool is built in Java to assist in data exploration, cleaning and analysis. It includes various inbuilt packages for discretization, numeration, scaling, attribute selection, missing values, outliers, statistics, visualization, balancing, sampling, row selection, and several other tasks. Its GUI is intuitive and simple to understand, and once someone starts working with it, it doesn't take long to figure out. A unique advantage of this tool is that the data set used for analysis is not held in the computer's memory, which makes it possible to work on large data sets without speed or memory troubles.
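As an illustration of two of the preprocessing steps listed above, here is a small pure-Python sketch (not Data Preparator's own code) of min-max scaling and equal-width discretization:

```python
# Illustrative pure-Python sketches of two preprocessing steps that tools like
# Data Preparator offer through a GUI: min-max scaling and discretization.

def min_max_scale(values):
    """Rescale values linearly to the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def discretize(values, bins):
    """Assign each value to one of `bins` equal-width intervals (0-indexed)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins
    return [min(int((v - lo) / width), bins - 1) for v in values]

data = [10, 20, 30, 40, 50]
print(min_max_scale(data))   # [0.0, 0.25, 0.5, 0.75, 1.0]
print(discretize(data, 2))   # [0, 0, 1, 1, 1]
```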

15. DataCracker

It's data analysis software that specializes in survey data. Many companies run surveys but struggle to analyze them statistically. Survey data is never clean; it contains many missing and inappropriate values. This tool reduces the agony and improves the experience of working with messy data. It is designed to load data from all major online survey programs such as SurveyMonkey and SurveyGizmo.

16. Data Applied

This powerful interactive tool is designed for building and sharing data analysis reports. Creating visualizations on large data sets can be troublesome, but this tool is robust at visualizing large amounts of data using tree maps. Like the other tools above, it has features for data transformation, statistical analysis, anomaly detection and more.

17. Tanagra Project

This tool has an old-fashioned UI, but this free data mining software is designed to build machine learning models. The Tanagra project started as free software for academic and research purposes. Being an open source project, it gives you room to devise your own algorithms and contribute.

18. H2O

H2O is one of the most popular software packages in the analytics industry today. In just a few years, the organization behind it has succeeded in evangelizing the analytics community around the world. This open source software delivers a lightning-fast analytics experience, which is further extended through APIs for programming languages. It supports not just data analysis but also building advanced machine learning models in no time.

Bonus Additions:

In addition to the awesome tools above, here are some more tools that might be worth a look. These tools aren't free, but trial versions are available:

  1. Data Kleenr
  2. Data Ladder
  3. Data Cleaner
  4. WinPure

End Notes

Once a person starts working with these tools, they will realize that knowing how to program isn't a prerequisite for predictive modeling; the same things can be accomplished with these open source tools. So if a lack of coding skills has been holding anyone back, now is the time to channel that enthusiasm into these tools.

The only limitation with some of these tools is the lack of community support. Apart from a few, many of them don't have a community to turn to for help and suggestions. Still, they're worth a try!

PS: All the above are personal perspectives based on information provided by Analytics Vidhya.

Source : https://www.analyticsvidhya.com/blog/2016/09/18-free-exploratory-data-analysis-tools-for-people-who-dont-code-so-well/


November 12th, 2016 by blogadmin No comments »


The Analytics & Big Data sector has grown consistently over the last five years despite an increasingly volatile and uncertain global outlook. In spite of this outlook, the analytics and Big Data market is expected to outgrow the overall IT market. Here we assess the global scenario, the reasons for this growth, individual returns in terms of salary, and the impact Analytics and Big Data have on today's economy.

Impact on businesses

Analytics & Big Data have revolutionized the way business is done around the world. All companies, from small businesses to the Fortune 500, rely on data and analytics to make critical business decisions. From understanding consumer behavior to predicting market trends, right down to product features, many moves in companies across the world are driven by analytics and data.

In today's global world, Big Data and Analytics are used in entertainment, education, transportation, government, defense, retail, health care and finance.

For example, Amazon is one of the world's leading consumer companies using analytics and Big Data to shape its products, services and delivery. Amazon uses analytics to suggest products on the customer's homepage based on their previous purchase history and browsing habits, and it analyses the customer's mindset based on the sites they frequently visit and the products they purchase elsewhere.

Twitter uses analytics to fill your news feed with updates from the people you interact with most. Flipkart and Snapdeal use predictive analytics. The Postal Service invests in gathering and analyzing data to improve last-mile delivery operations.
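To make the idea concrete, here is a toy sketch (emphatically not Amazon's actual algorithm) of recommending products from purchase history: it counts which items most often co-occur in past shopping baskets and suggests the strongest matches a customer hasn't bought yet.

```python
# Toy recommender: suggest items that most often co-occur in past baskets
# with items the customer already bought. Purely illustrative example data.
from collections import Counter
from itertools import combinations

def recommend(histories, customer_items, top_n=2):
    # Count how often each ordered pair of items appears together in a basket.
    co = Counter()
    for basket in histories:
        for a, b in combinations(set(basket), 2):
            co[(a, b)] += 1
            co[(b, a)] += 1
    # Score candidate items by co-occurrence with the customer's purchases.
    scores = Counter()
    for item in customer_items:
        for (a, b), n in co.items():
            if a == item and b not in customer_items:
                scores[b] += n
    return [item for item, _ in scores.most_common(top_n)]

past = [["book", "lamp"], ["book", "lamp", "pen"], ["lamp", "pen"]]
print(recommend(past, ["book"]))  # ['lamp', 'pen']
```

Real systems use far richer signals (browsing habits, ratings, matrix factorization), but the co-occurrence idea is the same starting point.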

Hiring trends

People used to specialize in just one tool or domain, but not anymore. A Master's degree or an MBA on a resume no longer impresses hiring companies by itself. Today, companies are investing in employees who know how to use the entire analytics & Big Data tool set. This year, people who know R and Python command a premium. Companies are looking for data scientists: people who combine an understanding of analytics with business knowledge, including market analysis and familiarity with similar industries.

R and Python are the front-runners in the analytics race. The more diverse and business-oriented a person's skill set, the more they earn.

Strategists and analysts skilled in both Big Data and data science are being snapped up at the highest salaries. Cash-strapped startups spend money on their star analysts, but when it comes to tools, they prefer open-source ones like R.

Analysts can expect a steep increase in their salaries once they cross the 5-year mark.

Big Data analysts are on a better earning footing: Big Data professionals earn more than data scientists. But those who combine the two roles and know how to work with both command the largest payout.

Earnings and skill requirement as per Companies Size

Startups and mid-size companies need people who know R, and they are willing to pay top dollar for it; R is in great demand across the board. But a person who wants to join a large company will need to add SAS to their skills, because larger companies can afford proprietary software like SAS, which smaller companies may not. The biggest jump in salaries comes after the 5-year mark, when analysts can expect up to a 70% raise over the average pay.


Guidelines for Anyone Considering a Career in Big Data Analytics


A data scientist is a person with an analytical mindset. Data analysts have inquisitive minds and enjoy quizzes and solving complex puzzles. They also spend time analyzing numbers and inspecting huge financial data sets to see if they can perceive meaningful patterns or spot discrepancies.
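As a small taste of the discrepancy-hunting described above, here is a sketch that flags transactions deviating sharply from the mean using a simple z-score threshold (the figures are invented; real fraud or anomaly detection is far more sophisticated):

```python
# Flag values whose z-score (distance from the mean in standard deviations)
# exceeds a threshold. A simple first pass at spotting discrepancies.
from statistics import mean, stdev

def flag_discrepancies(amounts, threshold=2.0):
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > threshold]

transactions = [102, 98, 101, 97, 103, 99, 100, 550]  # one suspicious entry
print(flag_discrepancies(transactions))  # [550]
```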

Pursue College Degree or enroll in Institutes offering Big Data Programs

After self-assessment, one can decide whether to go to college for a degree in data science or whether an intensive certification course would be more beneficial. A little research will show which universities or institutes offer courses or programs that suit one's profile.

Familiarizing oneself with the data analytics landscape

Analytics comprises various techniques and tools that can be applied in different ways across fields such as business and healthcare management. Anyone wishing to pursue data analytics should analyse their intended work 'domain' in order to decide which courses to take. Keep in mind that some software skills are in greater demand in certain professional sectors than in others.

In-Demand Data Analytics Skills

While data analytics skills are in great demand these days, some are more sought-after than others. Hands-on experience with several kinds of software will command a better salary than expertise in just one.

Big Data

Popular Big Data-specific skills include statistics, programming and mathematical modelling. Combined knowledge of R and Python can equip a person with these skills.

R

Sometimes referred to as 'a hyperactive version of Excel', R is used by organizations as varied as Facebook, Google and leading news agencies. It is used to sift through large data sets, which it can then easily 'manipulate' using modelling techniques and powerful data visualization tools.

Python

Python is a versatile, open-source programming language. It is fairly easy to learn and pick up, it can be used both to create web apps and to perform analytics, and it is one of the most popular coding languages in the world. It was created by Guido van Rossum and first released in the early 1990s.
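For a quick taste of why Python appeals to analysts, here is basic analysis in plain Python using only the standard library (the sales figures are made up):

```python
# Summary statistics over a small, invented sales sample, using only the
# Python standard library -- no external analytics packages required.
from statistics import mean, median

sales = [250, 300, 275, 500, 320, 290]
print("mean:", round(mean(sales), 2))   # mean: 322.5
print("median:", median(sales))         # median: 295
print("best day:", max(sales))          # best day: 500
```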

Hadoop

Like Big Data, Hadoop is increasingly being referenced in job advertisements. With its large capacity, Hadoop processes Big Data at scale. Its growing demand and appeal show no sign of decreasing in the coming years.
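Hadoop's core idea is the MapReduce model. The following pure-Python toy mimics its two phases (map, then reduce) for a word count; a real Hadoop cluster distributes the same logic across many machines.

```python
# A toy imitation of MapReduce's two phases for a word count.
# No actual Hadoop involved -- this just shows the programming model.
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    # Reduce: sum the counts for each distinct word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big insights", "data drives decisions"]
print(reduce_phase(map_phase(docs)))
```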

The following industries are driving the growing demand for data science skills, a trend expected to keep increasing in the coming years:

  • Scientific and Health care Services
  • Technical Services
  • Information Technologies
  • Manufacturing
  • Finance and Insurance
  • Retail
  • Government