By: Tejender Mohan
Spoiler Alert!
If you are visiting this blog just to get the code to copy-paste, you should go do some more Google searches!
In this blog, we will discuss the implementation process and logic to build a lightning component that reads a CSV file and creates/updates records on a custom object based on CSV columns.
So Let’s Start Baking our Lightning Cake Component that Reads CSV Files…
Building a Lightning component is like preparing your favorite cake recipe. All you need is:
- Ingredients (Metadata)
- Well-defined steps to prepare the mix (Building Algo/Code)
- Baking Oven (Salesforce platform to run the code)
- Recipe Tests (Dev Tests)
- And don’t forget icing on the cake (The FrontEnd GUI)
Step 1 : Ingredients (Metadata)
- An accessible Object (custom or standard) with fields to store CSV column data
- In our example, we will use the below API names:
- Custom_Object__c
- Id
- Field_1__c
- Field_2__c
- Field_3__c
- Field_4__c
- A CSV file with static column names that users can use to add records as rows. To update existing records, the file must include a column that serves as a unique key (a sample file layout is shown after the column list below).
- In our example, we will use the below column names:
- Column_0 (Id : Unique key)
- Column_1
- Column_2
- Column_3
- Column_4
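For illustration only, such a file could look like the rows below (the data values are made up; the header names are what drive the mapping). A row with a blank Column_0 creates a new record, while a row carrying an existing record Id updates that record:
Column_0,Column_1,Column_2,Column_3,Column_4
,Sample value 1,Sample value 2,Sample value 3,Sample value 4
a0B5g00000AbCdEAAR,Updated value 1,Updated value 2,Updated value 3,Updated value 4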
- A custom metadata type to store the mapping of CSV column names to Salesforce fields.
- In our example, we will use the below API names:
- CSV_to_Salesforce_Field_Map__mdt
- CSV_Field_Name__c
- Salesforce_Field_Name__c
- Active__c
- Create the active Custom Metadata Records that define the mapping of CSV columns to Salesforce Fields.
- In our example,
- Column_0 → Id
- Column_1 → Field_1__c
- Column_2 → Field_2__c
- Column_3 → Field_3__c
- Column_4 → Field_4__c
- A Lightning component bundle with an Apex controller
Step 2 : The Recipe (The Code Logic)
- The first step is to allow users to upload a file.
- Since the requirement is to upload and read a CSV, use the below Lightning framework tag for that.
<lightning:input/> (https://developer.salesforce.com/docs/component-library/bundle/lightning:input/example#lightningcomponentdemo:exampleInputFile)
<lightning:input class="slds-align_absolute-center"
    aura:id="fileId" onchange="{!c.handleFilesChange}"
    type="file" name="file" multiple="false"/>
Here, the handleFilesChange method performs actions in the JS controller after the file is uploaded.
You can add file validations in the handleFilesChange method based on your requirements.
E.g. the below code snippet checks the file extension before proceeding further:
handleFilesChange: function(component, event, helper) {
    var fileName;
    if (event.getSource().get("v.files").length > 0) {
        fileName = event.getSource().get("v.files")[0]['name'];
        //Check the last three characters of the file name for the CSV extension
        if (fileName.substr(fileName.length - 3).toUpperCase() != 'CSV') {
            fileName = 'Please select a file with CSV Extension';
        }
    }
},
- On your component's initialization function (init), fetch the custom metadata mapping records and store them in a Map attribute of the component, as sketched after the attribute declaration below.
<aura:attribute name="mapCSVColumnToSFFeild" type="Map" />
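For reference, here is a minimal sketch of such an init handler. It assumes a hypothetical @AuraEnabled Apex method getFieldMappings (not shown in this post) that returns the active CSV_to_Salesforce_Field_Map__mdt records; adapt the names to your own bundle.
doInit: function(component, event, helper) {
    //"c.getFieldMappings" is a hypothetical Apex method that returns the active mapping records
    var action = component.get("c.getFieldMappings");
    action.setCallback(this, function(response) {
        if (response.getState() === "SUCCESS") {
            var mappings = response.getReturnValue();
            var mapNameToAPIname = {};
            var listOfCSVLabels = [];
            //Build a CSV-column-name to field-API-name map, with spaces removed from the keys
            mappings.forEach(function(m) {
                var key = m.CSV_Field_Name__c.replace(/\s/g, '');
                mapNameToAPIname[key] = m.Salesforce_Field_Name__c;
                listOfCSVLabels.push(key);
            });
            component.set("v.mapCSVColumnToSFFeild", mapNameToAPIname);
            component.set("v.listOfCSVLabels", listOfCSVLabels);
        }
    });
    $A.enqueueAction(action);
},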
To Read CSV File
- Use the JS FileReader to read the uploaded file data as text.
- Loop through each row of the text data and process it to build a JS object that mirrors a Custom_Object__c record.
- While comparing the uploaded CSV columns with the custom metadata mapping, remove spaces and convert both sides to uppercase to avoid mismatches caused by JavaScript's case sensitivity.
- Review the below code snippet with its comments to understand the read functionality:
readCSVData : function(component, event, helper) {
    //List of allowed CSV column names with spaces removed
    var listAcceptedFields = component.get("v.listOfCSVLabels");
    //Converting the above list to uppercase
    var listAcceptedFieldsUpperCase = listAcceptedFields.map(function(x){ return x.toUpperCase(); });
    //Map of CSV column name to Salesforce field API name
    var mapNameToAPIname = component.get("v.mapCSVColumnToSFFeild");
    //To get the uploaded file data
    var files = component.find('fileId').get("v.files");
    //To hold the text data of the reader
    var textData;
    //To hold the list of row data, similar to the custom object record list
    var rowsList = [];
    var reader = new FileReader();
    //To check for an empty object
    function isEmpty(obj) {
        for (var key in obj) {
            if (obj.hasOwnProperty(key))
                return false;
        }
        return true;
    }
    //BONUS : A lot of people forget to add this important check when reading a CSV,
    //that is, to handle cell values that contain commas and new lines.
    function csvToArray(text) {
        let p = '', row = [''], ret = [row], i = 0, r = 0, s = !0, l;
        for (l of text) {
            if ('"' === l) {
                if (s && l === p) row[i] += l;
                s = !s;
            } else if (',' === l && s) l = row[++i] = '';
            else if ('\n' === l && s) {
                if ('\r' === p) row[i] = row[i].slice(0, -1);
                row = ret[++r] = [l = ''];
                i = 0;
            } else row[i] += l;
            p = l;
        }
        return ret;
    }
    reader.onload = function() {
        var text = reader.result;
        textData = text;
        //Calling the above method
        var rows = csvToArray(textData);
        //First row of the CSV
        var header = rows[0];
        /* Ignore the first row (header) and start from the second */
        for (var i = 1; i < rows.length; i = i + 1) {
            //Set of cells that define a row
            var cells = rows[i];
            if (cells.length != 1) {
                //To store each row's data in object format
                var rowData = {};
                //Looping based on the number of columns
                for (var j = 0; j < header.length; j = j + 1) {
                    if (typeof header[j] != 'undefined') {
                        //To check if the column is acceptable
                        if (listAcceptedFieldsUpperCase.indexOf(header[j].replace(/\s/g, '').toUpperCase()) > -1) {
                            //If the cell has data
                            if (cells[j]) {
                                //Add the cell data under the Salesforce field API name property of the rowData object
                                rowData[mapNameToAPIname[header[j].replace(/\s/g, '')]] = cells[j];
                            }
                        }
                    }
                }
                if (!isEmpty(rowData)) {
                    rowData['sobjectType'] = 'Custom_Object__c';
                    rowsList.push(rowData);
                }
            }
        }
        //Save the records list on the component attribute
        component.set("v.listRecordToUpload", rowsList);
        //Upload data to Salesforce
        window.setTimeout(
            $A.getCallback(function() {
                //Calling the helper function to upsert the records on the server
                helper.uploadToSalesforce(component, event, helper);
            }), 2000);
    };
    if (files[0] !== undefined && files[0] !== null && files[0] !== '') {
        reader.readAsText(files[0]);
    }
},
- Finally, create a helper method that sends listRecordToUpload to the Apex controller, where Apex upserts those records on the server (a sketch follows below).
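As a reference, here is a minimal sketch of that helper. The Apex method name upsertRecords is a placeholder for a hypothetical @AuraEnabled method that accepts a List<Custom_Object__c> and upserts it; use whatever names exist in your own controller.
uploadToSalesforce : function(component, event, helper) {
    //"c.upsertRecords" is a hypothetical Apex method that upserts the passed records
    var action = component.get("c.upsertRecords");
    action.setParams({ records : component.get("v.listRecordToUpload") });
    action.setCallback(this, function(response) {
        if (response.getState() === "SUCCESS") {
            //Show a success toast / message as per your requirement
            console.log('Records upserted successfully');
        } else {
            //Surface the server-side error to the user
            console.log('Error: ' + JSON.stringify(response.getError()));
        }
    });
    $A.enqueueAction(action);
},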
Step 3 : Bake the Cake : Compile and Run the Code
- Build your own code using the above logic.
- The code can be compiled and checked for syntax errors in an IDE (Developer Console, VS Code, etc.).
- To run the component, you can use one of the following :
- Create a Lightning component based custom action on any object's record page and execute your component using that action.
- A better way is to create a Lightning app and a Visualforce page, and use ltng:outApp to run the component from the VF page preview (a sketch follows the link below).
- https://developer.salesforce.com/docs/atlas.en-us.lightning.meta/lightning/components_visualforce.htm
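For reference, a minimal sketch of that wiring, assuming the component is named c:CSVUploader and the dependency app c:CSVUploaderApp (both names are placeholders):
<!-- CSVUploaderApp.app : dependency app that extends ltng:outApp -->
<aura:application access="GLOBAL" extends="ltng:outApp">
    <aura:dependency resource="c:CSVUploader"/>
</aura:application>
<!-- Visualforce page that hosts the component via Lightning Out -->
<apex:page>
    <apex:includeLightning />
    <div id="lightningContainer"></div>
    <script>
        $Lightning.use("c:CSVUploaderApp", function() {
            $Lightning.createComponent("c:CSVUploader", {}, "lightningContainer");
        });
    </script>
</apex:page>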
Step 4 : Testing (You won’t need a toothpick here to check how well baked your code is!)
- While writing your code, you can frequently test your changes by running the component on the Salesforce platform using the above methods.
- For precise debugging of Lightning components, use the Salesforce Lightning Inspector Chrome extension. https://chrome.google.com/webstore/detail/salesforce-lightning-insp/pcpmcffcomlcjgpcheokdfcjipanjdpc?hl=en
- Try to build the component on your own and test it by uploading different types of data in CSV files.
Step 5 : The Icing on the cake
- Add logos, warnings, errors, alerts, pop-ups, divs, styles, text, etc. to design the component as per your requirements.
Limitation:
Just like you limit the calories in your cake to prevent adverse health effects, Lightning components have to be built with your Salesforce org's health in mind. For example, the process should stay within the governor limits to avoid timeouts and processing failures.
For the above component, the file size that can be uploaded and the number of CSV records that can be processed will depend on the governor limits of your Salesforce org.
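In practice, one simple client-side guard is to reject oversized files before reading them. Below is a minimal sketch that could sit inside handleFilesChange, assuming an illustrative limit of 750 KB (this threshold is not a platform value; pick one that suits your org and record volumes):
//Reject oversized uploads before reading the file.
//The 750 KB threshold is only an illustrative value, not a platform limit.
var MAX_FILE_SIZE = 750 * 1024; // bytes
var file = event.getSource().get("v.files")[0];
if (file.size > MAX_FILE_SIZE) {
    fileName = 'File is too large. Please upload a CSV smaller than 750 KB.';
    return;
}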
Enjoy the Process, Keep Learning!