Hands-On Training
Pentaho Course Overview
Welcome to our Pentaho Training — a great way to start your Business Intelligence and Analytics career. This training is delivered by experienced Pentaho professionals with many years of domain expertise. It covers all the essentials of the robust Pentaho platform, from basics to advanced concepts, and includes hands-on practice in this BI platform with projects and expert support throughout.
To apply for the Pentaho Training, you should meet the following prerequisites:
- Good knowledge of databases and SQL for retrieving and manipulating data.
- A good understanding of data modelling techniques and data warehousing concepts such as data integration and ETL.
Pentaho Course Content
We have developed the latest Pentaho BI course content based on deep research and industry experts’ advice. This Pentaho Training program ensures learners have the right skills to meet the current industry standards. You can go through the following course modules of Pentaho BI to get an overall idea.
In this section, you will be introduced to the basic concepts of Pentaho BI to make you fundamentally strong.
- Pentaho Overview
- Introduction to Pentaho business intelligence and analytics
- Understanding dimension tables and fact tables
- Database dimensional modelling
- Using star schemas for querying massive data sets
- Snowflake Schema
- Slowly changing dimensions principles
- Pentaho artifacts knowledge
- Managing high-availability support for DI and BA servers
Learning Outcome: By the end of this chapter, you will gain a complete overview of topics such as database dimensional modelling, schema queries for large data sets, Pentaho artifacts, the user console, and more.
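To make the star schema idea concrete, here is a minimal sketch in Python using an in-memory SQLite database: one fact table joined to one dimension table, with a measure aggregated by a dimension attribute. The table and column names are invented for illustration and are not taken from any Pentaho sample database.

```python
import sqlite3

# In-memory database with one dimension table and one fact table
# (hypothetical names, for illustration only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 10.0), (1, 5.0), (2, 7.5)])

# A typical star-schema query: join the fact table to its dimension
# and aggregate the measure by a dimension attribute.
cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""")
rows = cur.fetchall()
print(rows)  # [('Gadget', 7.5), ('Widget', 15.0)]
```

A snowflake schema follows the same pattern, except the dimension tables are themselves normalized into further lookup tables.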
This section has been designed to teach you the concepts of data and how data is used in Pentaho to generate business insights.
- Designing data models for reporting
- Design a Streamlined Data Refinery
- Using Pentaho for predictive analytics
Learning outcome: By the end of this chapter, you will gain hands-on expertise in predictive modelling and data modelling, and in developing a data refinery solution for a sample client.
Clustering is used for statistical analysis in many fields. In Pentaho, multiple nodes join together to form a cluster.
- Basic data integrations in Pentaho
- Moving CSV file input to Microsoft Excel output and table output
- Transferring data from Excel to data grid and log
Learning outcome: Upon the completion of this chapter you will gain hands-on experience on the above topics.
A transformation is defined as a network of logical steps. For example, a transformation might read a flat file, sort it, filter it, and move the result into a relational database table. In this section, you will learn all the concepts related to data transformation.
- Data integration steps in Pentaho
- Understanding calculator
- Adding sequence
- String replace
- Pentaho number range
- Selecting field value
- String operation
- Sorting and splitting rows
- The process to use metadata injection
- Unique row and value mapper
Learning outcome: Upon the completion of this chapter you will gain an overall knowledge of the above-mentioned concepts and data transformation process in Pentaho.
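The read-sort-filter-load flow described above can be sketched in plain Python. In PDI these steps are wired together graphically; the file contents and table name below are made up for the example.

```python
import csv
import io
import sqlite3

# A flat file as it might arrive (inline here instead of on disk).
flat_file = io.StringIO("name,score\ncarol,72\nalice,91\nbob,55\n")

rows = list(csv.DictReader(flat_file))             # step 1: CSV input
rows.sort(key=lambda r: r["name"])                 # step 2: sort rows
rows = [r for r in rows if int(r["score"]) >= 60]  # step 3: filter rows

# Step 4: table output into a relational database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)",
                 [(r["name"], int(r["score"])) for r in rows])
loaded = conn.execute("SELECT name, score FROM scores").fetchall()
print(loaded)  # [('alice', 91), ('carol', 72)]
```

Each line of this sketch corresponds to one step icon on a PDI canvas; the value of the tool is that it lets you compose and reorder such steps without writing the glue code by hand.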
This section has been designed to provide you with the overall knowledge of Pentaho Flow and its associated concepts.
- Introduction to secure socket command
- Pentaho mail
- Pentaho error handling and null value
- Prioritize streams and row filter
Learning outcome: You will gain practical exposure to concepts such as handling null values in the data, working with the socket command, and error handling, along with end-to-end knowledge of Pentaho flow.
This section covers the overall concepts associated with deploying slowly changing dimensions (SCD) in Pentaho.
- Knowledge of slowly changing dimensions
- Dynamic transformation
- Making ETL dynamic
- Creating folders
- File management
- Bulk loading
- Pentaho file transfer process
- Utility and File encryption
- Repository and XML
Learning outcome: Upon completion of this chapter, you will gain hands-on exposure to all the concepts associated with deploying SCD and their contribution to Pentaho BI.
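As an illustration of the slowly-changing-dimension idea, here is a minimal Type 2 update in Python: instead of overwriting a changed attribute, the current dimension row is closed and a new current row is appended. The record layout and field names are invented for this sketch, not PDI's own.

```python
def scd2_update(dim_rows, key, new_attrs, load_date):
    """Close the current version of `key` (if changed) and add a new one."""
    for row in dim_rows:
        if row["key"] == key and row["current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # nothing changed, keep history as-is
            row["current"] = False      # close the old version
            row["end_date"] = load_date
    dim_rows.append({"key": key, **new_attrs,
                     "start_date": load_date, "end_date": None,
                     "current": True})  # append the new current version
    return dim_rows

dim = [{"key": 42, "city": "Austin",
        "start_date": "2020-01-01", "end_date": None, "current": True}]
scd2_update(dim, 42, {"city": "Denver"}, "2024-06-01")
print(len(dim), dim[0]["current"], dim[1]["city"])  # 2 False Denver
```

PDI's "Dimension lookup/update" step implements this same bookkeeping against a real database table, with surrogate keys and date ranges managed for you.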
Pentaho is an environment for ETL development and collaborative analysis. The Pentaho Repository is the medium through which Pentaho shares folders and files across products and teams. It also lets you track modifications and revert to previous versions of files by providing a version history. This section will make you knowledgeable in the Pentaho repository concept.
- Creating dynamic ETL
- Parameter deployment with a transformation
- Sending values and variables from job to transformation
- Database connection
- Role of Repository in Pentaho
- Repository import and environmental variable.
Learning outcome: By the completion of this chapter, you will gain a complete understanding of the Repository concept and its functionality in Pentaho.
The Pentaho Report Designer is one of the business intelligence tools used to design complex reports. It is best suited to reporting that is complex and requires cross-linking. This section is designed to provide you with complete knowledge of the report designer and repository concepts.
- Working with Pentaho report and dashboard
- Designing a report
- Effect of row banding
- Working with Pentaho Server
- Pie and bar chart in Pentaho
- Creating line charts
- Achieving localization in reports.
Learning Outcome: By the completion of this chapter you will gain an overall working knowledge of Pentaho repositories and designing various reports in Pentaho.
A Pentaho dashboard is an information management tool used to track departmental, individual, or enterprise performance, providing critical information that is essential for making crucial decisions. In this section, you will learn Pentaho dashboard concepts and how they work.
- Introduction to Pentaho dashboard concept
- Drill-down of Report
- Passing parameters in dashboard and report
- Working with Excel sheets
- Cubes deployment for report creation
- The data integration process in Pentaho for developing reports
Learning Outcome: By the completion of this chapter, you will gain hands-on expertise in the above-mentioned Pentaho concepts.
A cube is a collection of dimensional models centred on a fact table. In this section, you will learn about cubes and their associated concepts.
- What is a cube?
- Developing a Cube
- Benefits of Cube
- Working with cube
- Reports and dashboard creation with cube
Learning Outcome: Upon the completion of this module you will gain practical exposure in developing reports and dashboards using Cube.
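The essential idea of a cube — a measure that can be aggregated along whichever dimensions you choose — can be sketched with a small in-memory roll-up. The fact data and dimension names below are invented for the example; a real Mondrian cube does this against a star schema.

```python
from collections import defaultdict

# Toy fact rows: (product, region, sales).
facts = [("Widget", "East", 10), ("Widget", "West", 5),
         ("Gadget", "East", 7), ("Widget", "East", 3)]

def rollup(facts, dims):
    """Aggregate the sales measure over the chosen dimension indexes."""
    totals = defaultdict(int)
    for *keys, sales in facts:
        totals[tuple(keys[d] for d in dims)] += sales
    return dict(totals)

# Slice the same facts along two different dimensions.
by_product = rollup(facts, [0])
by_region = rollup(facts, [1])
print(by_product)  # {('Widget',): 18, ('Gadget',): 7}
print(by_region)   # {('East',): 20, ('West',): 5}
```

The benefit the module list mentions follows from this: once the cube is defined, every report or dashboard slice is just a different choice of dimensions over the same precomputed structure.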
Multi-Dimensional Expressions (MDX) is an OLAP query language similar to SQL, and also a calculation language. MDX is used to retrieve data from an OLAP database. In this section, you will learn all the concepts associated with MDX.
- Introduction to Multidimensional Expression (MDX)
- Understanding Tuple
- MDX sets, members, level
- MDX implicit dimensions
- Hierarchical navigation
- Dimensions referencing and metadata.
Learning Outcome: Upon completion of this chapter, you will gain hands-on exposure to working with MDX, dimension referencing, MDX sets, levels, members, and other MDX concepts.
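For a feel of the syntax, here is a tiny Python helper that assembles a basic MDX SELECT with one set on each axis. The cube and member names are hypothetical, and the helper only builds the query text — it does not validate it or run it against an OLAP server.

```python
def mdx_select(measures, rows, cube):
    """Assemble a basic MDX SELECT (names are illustrative, not validated)."""
    cols = "{" + ", ".join(measures) + "} ON COLUMNS"
    row_axis = "{" + ", ".join(rows) + "} ON ROWS"
    return f"SELECT {cols}, {row_axis} FROM [{cube}]"

query = mdx_select(["[Measures].[Sales]"],
                   ["[Time].[2024].Children"],
                   "SalesCube")
print(query)
# SELECT {[Measures].[Sales]} ON COLUMNS,
#   {[Time].[2024].Children} ON ROWS FROM [SalesCube]
```

Note how MDX addresses members through a dimension hierarchy (`[Time].[2024].Children`) rather than through table columns as SQL does — that hierarchical navigation is the heart of the topics listed above.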
Pentaho Analyzer helps users analyse reports and create visualizations from their data. Analyzer provides an easy-to-use environment where you can prepare reports and visualizations using drag-and-drop options and explore data in depth. In this chapter, you will learn Analyzer and its associated concepts.
- Pentaho analytics findings
- Advanced analytics for visualizing data
- Blending various data sizes and types
- Extending Analyzer functionality
- Pentaho REST APIs
- Embedding BA server reports
Learning Outcome: After completing this chapter, you will have complete knowledge of Pentaho Analyzer and its associated concepts.
Pentaho Data Integration provides ETL capabilities to extract, cleanse, and store data in a uniform format that can be accessed easily.
- Overview of PDI
- Steps to be followed for creating an ETL job
- The process to use property files
- Steps required to create an ETL transformation
Learning Outcome: In this section, you will gain complete knowledge of PDI development concepts, along with practical knowledge of developing ETL transformations with PDI and using property files.
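PDI reads simple key=value property files (such as kettle.properties) to parameterize jobs and transformations. A minimal parser for that format might look like the sketch below; the file content is made up for the example.

```python
import io

def load_properties(fh):
    """Parse key=value lines, skipping blanks and '#' comments."""
    props = {}
    for line in fh:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Inline stand-in for a properties file on disk.
sample = io.StringIO("# connection settings\nDB_HOST = localhost\nDB_PORT=5432\n")
props = load_properties(sample)
print(props)  # {'DB_HOST': 'localhost', 'DB_PORT': '5432'}
```

Keeping connection details in a file like this is what makes an ETL job portable between development and production environments without editing the job itself.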
Pentaho can be integrated with Hadoop at many levels, such as traditional ETL, data orchestration, and Pentaho MapReduce for data transformations. In this section, you will learn how ETL connectivity works with Hadoop.
- Implementing ETL to work on Hadoop Ecosystem
- Hadoop Integration
- Transferring data from local files to distributed file system
- Designing MapReduce jobs,
- Deploying Apache Hive
- Complete Hadoop integration using ETL.
Learning Outcome: Upon completion of this module, you will understand the complete Hadoop integration process in Pentaho.
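The MapReduce pattern that Pentaho generates jobs for can be sketched locally as a map phase, a shuffle that groups values by key, and a reduce phase. These are plain Python stand-ins for illustration, not Hadoop APIs.

```python
from collections import defaultdict

def map_phase(line):
    # Emit (word, 1) pairs, as a word-count mapper would.
    return [(word, 1) for word in line.split()]

def reduce_phase(word, counts):
    # Sum all counts emitted for one key.
    return word, sum(counts)

lines = ["big data", "big jobs", "data"]
shuffled = defaultdict(list)          # shuffle: group values by key
for line in lines:
    for word, count in map_phase(line):
        shuffled[word].append(count)
result = dict(reduce_phase(w, c) for w, c in shuffled.items())
print(result)  # {'big': 2, 'data': 2, 'jobs': 1}
```

In Pentaho MapReduce, the mapper and reducer above would each be a PDI transformation, and the shuffle is handled by the Hadoop cluster itself.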
Dashboards simplify the decision-making process by presenting the complex business data in an easily understandable way. In this section, you will gain overall knowledge of dashboard concepts.
- Developing creative and easy-to-understand dashboards to gain business insights and improve business performance
Learning Outcome: You will gain practical knowledge in creating highly interactive dashboards for visualizing data.
In Pentaho, performance tuning helps you find, locate, and remedy bottlenecks in system performance. Performance tuning is a way to respond to issues that arise during production. In this section, you will learn how performance tuning works in Pentaho.
- Managing BA server logging
- Monitoring the performance of a job
- Tuning Pentaho reports
- Auditing in Pentaho
Learning outcome: You will gain hands-on expertise in fine-tuning Pentaho reports and monitoring BA server logging.
Data often has to be integrated with various external and internal parties for analysis, and in such cases security issues arise. In this section, you will learn how to address security concerns in Pentaho.
- Ensuring Security levels in external Integration
- Data security
- Extending BA server security
- Using Kerberos with Pentaho
- Pentaho multi-tenancy support
Learning Outcome: Upon the completion of this section you will master the process to configure high-level security.
We at HKR not only provide you with theoretical training but also make you practically knowledgeable by having you work with real-world projects and case studies. Every course we offer includes two real-time projects that provide you with real-time experience. This practical knowledge improves your domain expertise and helps you clear the certifications with ease.
Pentaho Training Reviews
Pentaho Training Objectives
Pentaho is a business intelligence software suite that specializes in providing various data-related services such as data integration, reporting, OLAP services, data mining, information dashboards, and ETL (extract, transform, load) capabilities.
This HKR training’s Pentaho training course is designed to make you skilled in all the areas of Pentaho which include business intelligence, Pentaho data integration, dashboards, Pentaho Reporting, and Mondrian cubes. Pentaho is an Open Source BI suite that integrates large data sets with Hadoop and allows reporting and analysis on top of it.
- Pentaho BI suite Architecture
- Pentaho Analytics to develop reports
- Performing various data integration, advanced analytics and transformations
- Pentaho business analytics and dashboards
- Using ETL design patterns for Star Schema.
- Developing complex dashboards and reports for analysis
- Using Pentaho workbench for developing Mondrian Cube
- Integrating Pentaho with Big Data and MapReduce
- Pentaho Kettle for automatic building and deploying reports
- Performance tuning and transformations.
The following job roles benefit from learning this course:
- BI Developers
- Data Warehousing Programmers
- Business Analysts
- Solution Architects
- Mainframe and Testing Professionals
- Candidates who are aspiring to start their career in Data analytics and BI
There are no specific qualifications required to learn this course; anyone can opt for this training.
- Poor data quality costs US businesses $600 billion every year - TDWI
- The Big data and Data Analytics market is expected to witness a massive growth and reach $40.6 Billion in the next 4 years - ResearchandMarkets
- According to Indeed.com, the average salary received by a Pentaho Professional in the US amounts to $104,000
- Increased demand for data has created huge employment opportunities in diversified fields, and this number is growing constantly.
Pentaho Training FAQ's
Each and every class is recorded, so if you miss any class you can review the recording and clarify any doubts with the trainer in the next class.
We provide placement assistance but do not guarantee placement. We are tied up with some corporate companies, and when they have a requirement, we send them your profile.
Yes, we provide a demo before starting any training, in which you can clear all your doubts before the training begins.
Our trainers are real-time experts who are presently working on the platform on which they provide training.
You can call our customer care 24/7
Most students are satisfied with our training; if you are not, we provide specialised training in return.