Data flow in Salesforce Einstein Analytics

Prasanna
6 min read · May 6, 2020

A dataflow is a set of instructions in a JSON definition file for loading data from Salesforce objects or existing datasets, as well as from external sources, and transforming it into a new dataset in Einstein Analytics.

Dataflow process

Building a dataflow in Einstein Analytics involves five steps.

  • Design the dataflow, that is, define the data transformations.
  • Include external data in a dataset; external data used in a dataflow is called an "edgemart."
  • Customize the definition file with either the dataset builder or JSON, then upload the definition file manually.
  • Start the dataflow, monitor it, and troubleshoot any dataflow errors.
  • Finally, schedule the dataflow to run at regular intervals.

Now let us discuss each step of the dataflow process in depth. The figure shows the dataflow operation.

Dataflow Design

First, generate a dataflow definition file, which is a JSON file. Consider what data should be available for queries, where to extract the data from, and whether you need to transform the extracted data to prepare your final dataset. To explain these decisions, let's analyze one case: the aim is to merge Salesforce data with external data. The final dataset will contain Salesforce data from the Opportunity, Account, and User objects along with data from an external file (CSV format).

Dataflow Architecture

The dataflow architecture merges Salesforce data with external data through JSON instructions. It extracts opportunity, account, and user data from the Opportunity, Account, and User objects and creates a new dataset for each extracted object using the sfdcDigest transformation. External data is extracted through external connectors to create another dataset, called an edgemart.

Dataflow Builder then transforms the datasets created from the internal and external extracts. Specifically, the dataflow combines opportunity, account, and user data into a new dataset through the augment transformation. The dataflow then augments the edgemart with the Salesforce dataset. Finally, you configure the dataflow to register the final dataset so that end users can access it.
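The architecture described above can be sketched as a dataflow definition file. This is a minimal, illustrative sketch: the node names (Extract_Opportunity and so on), field lists, and the edgemart alias are assumptions for the example, not values from a real org.

```json
{
  "Extract_Opportunity": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Opportunity",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Amount" },
        { "name": "AccountId" }
      ]
    }
  },
  "Extract_Account": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [ { "name": "Id" }, { "name": "Name" } ]
    }
  },
  "Get_ExternalData": {
    "action": "edgemart",
    "parameters": { "alias": "External_Incentives" }
  },
  "Augment_Opp_Account": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Opportunity",
      "left_key": [ "AccountId" ],
      "right": "Extract_Account",
      "right_key": [ "Id" ],
      "right_select": [ "Name" ],
      "relationship": "Account"
    }
  },
  "Register_Final": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "OppsWithAccounts",
      "name": "Opportunities with Accounts",
      "source": "Augment_Opp_Account"
    }
  }
}
```

Each top-level key is a node; nodes reference one another by name through parameters such as "source", "left", and "right", which is how the dataflow chains extracts, joins, and the final registration together.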

Transformation Types

A transformation is a manipulation of a dataset. Dataflow Builder provides several transformations, including:

  • sfdcDigest Transformation: Extracts data from Salesforce objects.
  • digest Transformation: Extracts data synced from various connections to create a new dataset.
  • edgemart Transformation: Gives the dataflow access to an existing dataset, including datasets created from external sources (such as CSV files).
  • append Transformation: Combines records from multiple datasets into a single dataset. It does not remove duplicate records.
  • augment Transformation: Joins the input dataset with columns from another dataset using a specified key.
  • computeExpression Transformation: Adds derived fields to a dataset. The calculated field values are produced by a SAQL expression that can reference one or more input fields.
  • computeRelative Transformation: Analyzes trends in the data by deriving field values from values in other rows.
  • dim2mea Transformation: Creates a new measure based on a dimension.
  • flatten Transformation: Flattens hierarchical data. For example, you can flatten the Salesforce role hierarchy. This transformation is typically used for datasets that need role-based row-level security.
  • filter Transformation: Applies a filter condition that determines which records to keep in the dataset.
  • sliceDataset Transformation: Removes fields from a dataset, leaving a new dataset with only the fields needed for analysis.
  • update Transformation: Updates the specified field values in an existing dataset based on data from another dataset, the "lookup dataset." The transformation looks up new values from matching fields in the lookup dataset and stores the results in a new dataset.
  • sfdcRegister Transformation: Registers the dataset, making it available for queries and for building lenses and dashboards.
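To illustrate how one of these transformations looks in the definition file, here is a hedged sketch of a computeExpression node that derives a text field with a SAQL expression. The node name, field names, and threshold are illustrative assumptions, not values from the article.

```json
"Compute_Tier": {
  "action": "computeExpression",
  "parameters": {
    "source": "Extract_Opportunity",
    "mergeWithSource": true,
    "computedFields": [
      {
        "name": "Deal_Tier",
        "type": "Text",
        "saqlExpression": "case when Amount >= 100000 then \"Large\" else \"Standard\" end"
      }
    ]
  }
}
```

With "mergeWithSource" set to true, the derived Deal_Tier column is added alongside the original fields from the source node rather than replacing them.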

Transformations in Dataflow Builder

Configure the dataflow based on business requirements

After designing the dataflow, the next step is to configure it. You customize the dataflow to extract data, transform datasets to meet business requirements, and register the datasets you want to make available for queries. To customize the dataflow, add transformations to the JSON script.
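For example, adding a filter transformation to the JSON script might look like the following sketch. The node names and filter condition are illustrative, and the exact filter string syntax is worth confirming against the current documentation.

```json
"Filter_ClosedWon": {
  "action": "filter",
  "parameters": {
    "source": "Extract_Opportunity",
    "filter": "StageName:Closed Won"
  }
}
```

Downstream nodes would then reference "Filter_ClosedWon" as their source instead of "Extract_Opportunity", so only the filtered records flow into the registered dataset.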

Start, monitor, and reschedule the dataflow

After building the dataflow, you need to start a dataflow job manually to load data into datasets. Dataflow jobs are limited: you can run 24 dataflow jobs in a rolling 24-hour period, although jobs that run in under 2 minutes do not count toward the limit. After the initial dataflow job runs, you can schedule the dataflow to run regularly at a particular time so that it picks up modified Salesforce data. You can always stop, reschedule, and monitor a dataflow job to troubleshoot problems.

The next important step after transformation is to import data into Einstein Analytics. There are several data integration approaches:

  1. User Interface:

From the Analytics tab, click the Build button and choose CSV from the drop-down menu. Figure 4 displays the user interface for importing a CSV file. Click Section 1 to add the external CSV file. After you upload the file, a JSON metadata file is generated; you can edit this file to change optional metadata and upload the modified JSON file in Section 3. Click the Preview Data button (Section 4) to preview the data.

  2. Wave Connector:

To upload Excel data, use the Wave Connector app, which imports data from Microsoft Excel 2013 (or later) into Einstein Analytics. Wave Connector is available from the Microsoft Office Store. After installation, click Wave Connector to import Excel files into Salesforce.

To install Wave Connector, open Microsoft Excel 2013 (or later), click the Insert tab, and go to the Office App Store. Search for Wave Connector and click to install it. After installation, connect your Salesforce account to start uploading data.

  3. Dataset Builder:

From the Analytics Studio home page, click Build and pick Dataset. Among the options shown, select Salesforce, then choose a Salesforce object to continue. This root object is called the "grain," the entity from which relationships are established. Next, press the Plus button and pick the relevant fields. Click the Relationships tab to add more Salesforce objects: pick another Salesforce object and press Enter. In this way you turn Salesforce data into an Einstein Analytics dataset.

  4. External Data API:

If you want to build your own connector to load CSV or zipped files into Einstein Analytics, use the External Data API. The figure shows the External Data API architecture.
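When pushing a CSV through the External Data API, you upload the data alongside a JSON metadata file that describes the schema. The sketch below shows the general shape of such a metadata file; the object name, labels, and fields are illustrative assumptions, and the exact attribute set should be checked against the External Data API format reference.

```json
{
  "fileFormat": {
    "charsetName": "UTF-8",
    "fieldsDelimitedBy": ",",
    "linesTerminatedBy": "\n"
  },
  "objects": [
    {
      "connector": "CSVConnector",
      "fullyQualifiedName": "Incentives",
      "name": "Incentives",
      "label": "Incentives",
      "fields": [
        {
          "fullyQualifiedName": "Region",
          "name": "Region",
          "label": "Region",
          "type": "Text"
        },
        {
          "fullyQualifiedName": "Payout",
          "name": "Payout",
          "label": "Payout",
          "type": "Numeric",
          "precision": 10,
          "scale": 2,
          "defaultValue": "0"
        }
      ]
    }
  ]
}
```

The metadata tells Einstein Analytics how to parse the CSV (delimiter, charset) and how to type each column, so numeric fields like Payout are loaded as measures rather than text dimensions.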

  5. Connectors:

Einstein Analytics offers multiple connectors:

  • The Salesforce local connector replicates additional objects from the local org without having to touch the dataflow, while the Salesforce Multi-org Connector links Einstein Analytics to data in external Salesforce orgs.
  • The Marketing Cloud Connector provides access to marketing campaign data.
  • The Heroku Postgres connector transfers data from Heroku Postgres to Einstein Analytics.
  • The Amazon Redshift connector transfers raw data from Amazon Redshift to Einstein Analytics in aggregated form.
  • The Microsoft Dynamics CRM Connector replicates data from Microsoft Dynamics 365 to Einstein Analytics for use in lenses and dashboards.
  • The Google BigQuery Connector, introduced in the Einstein Analytics Winter '18 release, pulls in data from Google BigQuery, for example Google Analytics data exported to BigQuery. This can be useful for marketing teams analyzing website content and online interactions; Einstein Analytics can then present this data in lenses and dashboards.

Conclusion:

This article covered the Dataflow Builder transformations used to prepare data for building dashboards, as well as the various data integration methods Einstein Analytics provides for importing data.
