This is a simple JavaScript library that simplifies the process of uploading data from large Excel files to the server.
Instead of uploading the entire file to the server and then processing it there, this library saves you bandwidth by
processing the file on the local machine and uploading only the extracted data (which is what you actually want) to the server in batches.
Because it does the upload in batches, issues such as connection timeouts are taken care of. In addition, it reports the
rows of data that cause an exception on the server and makes them available for download as an Excel file.
Read the comprehensive guide here
Add the following to your page
<!-- required -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
<!-- optional -->
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/js/bootstrap.min.js"></script>
<script src="https://unpkg.com/sweetalert/dist/sweetalert.min.js"></script>
<!-- needed by older browsers -->
<script src="https://github.com/eligrey/Blob.js"></script>
<!-- Required -->
<script src="https://rawgit.com/eligrey/FileSaver.js/src/FileSaver.js"></script>
<script src="https://unpkg.com/xlsx/dist/xlsx.full.min.js"></script>
<script src="https://cdn.rawgit.com/SeunMatt/excel_uploader/4f4ebd93/src/excel_uploader.js"></script>
Then initialize the Uploader like this:
<script>
    $(document).ready(function () {
        new ExcelUploader({
            maxInAGroup: 1000,
            serverColumnNames: ["Name", "Email", "Phone Number"],
            importTypeSelector: "#dataType",
            fileChooserSelector: "#fileUploader",
            outputSelector: "#tableOutput",
            extraData: {_token: "23333323323223323232"}
        });
    });
</script>
From the snippet above, we instantiate the ExcelUploader object with configuration parameters. Following is an explanation of each parameter and what it does.
maxInAGroup: This parameter controls how many records are sent to the server at a time. The default is 1000, so if there are 5000 records in the source Excel file, the data will be uploaded to the server in 5 batches. It can be fine-tuned to taste based on the available resources.
serverColumnNames: This is an array of the column names the server is expecting. Every time excel_uploader pushes data to the server, it also sends a column_map.
The column map is generated by using the lowercase form of each element in serverColumnNames as the key and the index of the corresponding Excel column as the value.
For example, say the server expects the uploaded Excel data to contain 3 columns: Name, Email and Phone Number.
The user first uploads an Excel file and then maps which column in the file holds which piece of data. excel_uploader then composes the column map keys from the lowercase form of the serverColumnNames entries, replacing white spaces with an underscore (_) character (see the sketch after the payload below).
So the server at the end will receive a JSON payload like this:
{
    "column_map": {"name": 0, "email": 2, "phone_number": 1},
    "data": [
        ["name1", "08010xxx", "email1@g.com"],
        ["name2", "08020xxx", "email2@g.com"],
        ["name3", "08030xxx", "email3@g.com"],
        ...
    ]
}
This can be interpreted as: name is at index zero (0) of each data row, email is at index two (2) and phone_number is at index one (1).
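For illustration, here is a minimal sketch of how such column map keys can be derived from serverColumnNames under the lowercase-and-underscore rule above. This is not the library's internal API; buildColumnMap and excelColumnIndexes are hypothetical names, and the index array stands in for the mapping the user picks in the UI.
// Sketch: derive column_map keys from serverColumnNames by lowercasing
// them and replacing white space with underscores. excelColumnIndexes is
// a hypothetical array holding the Excel column index chosen for each name.
function buildColumnMap(serverColumnNames, excelColumnIndexes) {
    var columnMap = {};
    serverColumnNames.forEach(function (name, i) {
        var key = name.toLowerCase().replace(/\s+/g, "_");
        columnMap[key] = excelColumnIndexes[i];
    });
    return columnMap;
}

// buildColumnMap(["Name", "Email", "Phone Number"], [0, 2, 1])
// => {name: 0, email: 2, phone_number: 1}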
importTypeSelector: This is the jQuery selector for an HTML select field. The value of each option is the URL the Excel data is sent to when that option is selected. This makes it possible to use the same page for multiple kinds of upload: just add each destination URL as an option in the import type select field.
fileChooserSelector: This is the jQuery selector for the HTML file input field the user uses to choose the Excel file from local storage.
outputSelector: This is the jQuery selector for the HTML div used to display progress, status and error messages.
extraData: This is extra data to be sent to the server alongside the uploaded data and the column mapping, for example a CSRF token. It is an optional parameter. Example markup matching these selectors is sketched below.
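For reference, here is a minimal sketch of the markup those selectors could point to. The ids match the initialization snippet above; the option URLs and labels are purely illustrative.
<!-- import type: each option's value is the URL the extracted data is posted to -->
<select id="dataType">
    <option value="/import/customers">Customers</option>
    <option value="/import/orders">Orders</option>
</select>

<!-- file chooser: where the user picks the Excel file from local storage -->
<input type="file" id="fileUploader" />

<!-- output area: progress, status and error messages are rendered here -->
<div id="tableOutput"></div>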
Read the comprehensive guide here.
The server is expected to process the data in a transaction and return any rows that cause an exception as an array of arrays under the data key, e.g.:
{
    "data": [ ["data1", "data2", "data3"], ... ]
}
It is very important that the response is of JSON type and contains a payload like the one above.
If there are other errors, such as invalid or incomplete data, the server can return a JSON response with an error entry:
{
    "error": "This is an error message"
}
Note that the response code should be 200. See the demo code for more info.
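To make those expectations concrete, here is a minimal sketch of a receiving endpoint, assuming a Node.js/Express backend. The library itself is backend-agnostic: the route, the insertRow helper and its validation are hypothetical, and for brevity the sketch handles rows one by one instead of wrapping the whole batch in a database transaction as recommended.
const express = require("express");
const app = express();
app.use(express.json());

// Hypothetical persistence helper; a real implementation would insert the
// record into a database, ideally inside a transaction covering the batch.
function insertRow(record) {
    if (!record.email || record.email.indexOf("@") === -1) {
        throw new Error("Invalid email");
    }
    // ... save the record ...
}

app.post("/import/customers", function (req, res) {
    var columnMap = req.body.column_map; // e.g. {name: 0, email: 2, phone_number: 1}
    var rows = req.body.data;            // array of arrays sent by excel_uploader
    var failedRows = [];

    rows.forEach(function (row) {
        try {
            insertRow({
                name: row[columnMap.name],
                email: row[columnMap.email],
                phone: row[columnMap.phone_number]
            });
        } catch (e) {
            failedRows.push(row); // collect rows that caused an exception
        }
    });

    // Respond with status 200 and a JSON body; failed rows go back under the
    // "data" key so excel_uploader can offer them for download as an Excel file.
    res.status(200).json({ data: failedRows });
});

app.listen(3000);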
And that's all! Happy Bulk Data Processing.
Seun Matt - https://smattme.com
Love this library? You can support by buying me a coffee ☕
Want to add a feature or an improvement? Kindly submit a PR. It's advisable to involve the author from the get-go. Cheers!
Don't forget to star the repo and spread the word.