All the columns created in the Table Definition screen will be displayed on the Data screen. On this screen, you can perform the following operations –
- Import Rows
- Search Data
- Update Row
- Add Row
- Delete Rows
- Truncate Table
- Export Data
How Import Rows Works
To populate your table with data –
- First, you’ll have to import the CSV source file.
- Map the columns in the CSV file to the corresponding column within the table.
- Then, validate the CSV file. The possible validation outcomes are Validation Success and Validation Failed.
- Once the status is Validation Success or Validation Failed (Partially), click the Import button. This will start importing the data into the table.
To import a CSV source file to your table –
- Navigate to the Data screen, and on the right of your screen, hover your cursor over the icon, and click Import Rows.
- Then, click Download Sample. The sample file is generated dynamically, and it contains the names of all the columns created in the Table Schema.
- On the sample CSV file, enter relevant data in the correct data type under each column, and save the file.
- If you are not using the sample CSV file, and are instead creating a CSV file from scratch, then specify the header names (column names) in the first row, and add data across the subsequent rows.
- Click Upload, and add the relevant CSV file.
- Alternatively, you can drag-and-drop the CSV file.
- Once you’re done, click Next.
- The only file format supported is a “.csv” file.
- The maximum file size allowed is 50MB.
- Columns named ‘__LSQValidationMessage’ or ‘__LSQValidationStatus’ are not allowed in the CSV file.
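Before uploading, you can sanity-check a CSV against the constraints above. Below is a minimal Python sketch (not part of the product) that checks the file extension, the 50MB size limit, and the reserved column names; the function and file names are hypothetical:

```python
import csv
import os

# Constraints stated in the documentation above
RESERVED_COLUMNS = {"__LSQValidationMessage", "__LSQValidationStatus"}
MAX_SIZE_BYTES = 50 * 1024 * 1024  # 50MB

def preflight_check(path):
    """Return a list of problems found before uploading the CSV."""
    problems = []
    if not path.lower().endswith(".csv"):
        problems.append("file must have a .csv extension")
    if os.path.getsize(path) > MAX_SIZE_BYTES:
        problems.append("file exceeds the 50MB limit")
    # Read only the header row to check the column names
    with open(path, newline="", encoding="utf-8-sig") as f:
        header = next(csv.reader(f), [])
    for name in header:
        if name in RESERVED_COLUMNS:
            problems.append(f"reserved column name not allowed: {name}")
    return problems
```

An empty list means the file passes these particular checks; the server still performs its own validation after upload.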
Once you’ve imported the CSV file, you will have to map it against the corresponding column within the table.
- If the column names in the CSV file match the column names in the table, the columns are mapped automatically.
- Otherwise, from the CSV Column Name dropdowns, map each column to the correct Column Name.
- If DateTime columns are present in the Table Schema –
- From the Date/Time format in CSV section, select the Date/Time/DateTime formats used in the CSV file.
- If the table schema contains only some of the Date, DateTime and Time column types (rather than all three), then in the Date/Time format in CSV section, you can choose formats only for the column types that are present. The other formats will be disabled.
- Below are the default formats –
- Date – yyyy-MM-dd
- DateTime – yyyy-MM-dd hh:mm:ss
- Time – hh:mm:ss
- If there are no Date/Time/DateTime fields in the table schema, then this section will remain greyed out and inaccessible.
- Once you’re done, click Next.
- CSV column names are case-sensitive.
- All DateTime fields in the CSV file must follow one uniform format. You cannot mix DateTime formats within the same CSV file.
- For example, in your CSV file, you cannot have one column in the dd-MM-yyyy HH:mm:ss format, and another column in the MM/dd/yyyy HH:mm:ss format.
- Date, Time and DateTime fields will be stored in the default formats in the Mavis DB.
- Non-Nullable and Unique columns will be displayed before other columns.
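For illustration, the default patterns listed above can be produced with Python's strftime codes. This is a sketch only; it assumes the "hh" in the documented patterns means 24-hour time, and the column names are hypothetical:

```python
from datetime import datetime

# The documented default Mavis DB patterns, translated to Python strftime codes
# (assuming "hh" means 24-hour time; adjust to %I if your data is 12-hour).
DATE_FMT = "%Y-%m-%d"               # yyyy-MM-dd
DATETIME_FMT = "%Y-%m-%d %H:%M:%S"  # yyyy-MM-dd hh:mm:ss
TIME_FMT = "%H:%M:%S"               # hh:mm:ss

value = datetime(2024, 3, 5, 14, 30, 0)
# Every Date/Time/DateTime column in the CSV should use one uniform format
row = {
    "SignupDate": value.strftime(DATE_FMT),
    "LastLogin": value.strftime(DATETIME_FMT),
    "DailyReminder": value.strftime(TIME_FMT),
}
```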
The data in the CSV file will be validated on this screen.
If Validation Success –
If all the row(s) have been successfully validated, you’ll get the Validation Success message. You can choose to continue the CSV file import process by clicking Next.
If Validation Failed –
If one or more rows fail validation, you’ll get the Validation Failed message. There are two types of Validation Failed messages you might receive –
- Partially Failed – This means the validation for only a few rows in the CSV file failed, while the remaining rows’ validation was successful.
- Completely Failed – This means the validation for all the rows in the CSV file failed.
To rectify a failed validation –
- You’ll get the following details about the rows that were validated –
- Total Rows
- Success Rows
- Failed Rows
- Here, you can view the rows that failed during validation.
- If you want to ignore the Failed Rows error and proceed with only uploading the Success Rows in the file import process, you can do so by clicking Next.
- To download the failure report, click Failure Report.
- In the Failure Report file, you’ll see that two new system-generated columns have been added – “__LSQValidationMessage” and “__LSQValidationStatus”.
- The “__LSQValidationMessage” column will state the reason why the validation was unsuccessful for that particular row.
- The “__LSQValidationStatus” will display the validation status (“Success” or “Failed”) for that particular row.
- Once you’ve rectified the error in the original CSV file, save the file, and re-upload it from the Source screen.
- If all the rows have been successfully validated, you’ll get the Validation Success message. You can choose to continue the CSV file import process by clicking Next.
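If you prefer to inspect a downloaded Failure Report programmatically, a minimal Python sketch like the following can pull out the failed rows and their reasons (the file name is hypothetical; the two column names are the system-generated ones described above):

```python
import csv

def failed_rows(report_path):
    """Yield (row, reason) for each failed row in a downloaded Failure Report."""
    with open(report_path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            # "__LSQValidationStatus" is "Success" or "Failed" per row
            if row.get("__LSQValidationStatus") == "Failed":
                yield row, row.get("__LSQValidationMessage", "")
```

Remember to remove both system-generated columns before re-uploading the corrected file, since they are not allowed in a source CSV.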
All CSV files that are imported are usually queued within our servers. It may take a few minutes for these files to be successfully imported.
In the Summary screen, you’ll get a message stating, “You have successfully imported the file.” This means your file is currently queued for import, and it will be completed within a few minutes. On this screen, you can perform the following actions –
- Check status – To know the status of the file import, click the Check status button present at the bottom of the screen. This will redirect you to the Request History screen, where you can view the current import status. The following statuses are available –
- Initiated – This means the file import has been successfully initiated. This message will be displayed until the file import is completed.
- Validation In-Progress – This means the uploaded file is currently being validated.
- Validation Success – This means the uploaded file contains no errors and is successfully validated.
- Validation Failed – This means the uploaded file contains errors, and the validation was unsuccessful. Examples of errors include a String value with more than 500 characters, an Integer value with a decimal, a Coordinate Latitude outside the range of -90 to 90 degrees, and similar errors for other datatypes.
- Table Operation Queued – The Data Import is in the queue, and the data will be inserted into the table within a few minutes.
- Table Operation In-Progress – This means the data is currently being pushed to the table.
- Table Operation Failed – This means there was an error processing the CSV file. In the following scenarios, the status will change to Table Operation Failed –
- If the CSV file contains duplicate column names
- If the CSV file is corrupted
- If the CSV file does not contain any data
- If the CSV file is not CSV UTF-8 encoded
- If the CSV file contains the “__LSQValidationMessage” or “__LSQValidationStatus” columns
- If any other error occurred while processing the file
- Table Operation Success – This means the data was successfully inserted into the table.
- View data – To view the data that was successfully uploaded (or if the file import is still in progress, then you can view the data that has been successfully imported until then), click View data.
- Import new file – To import a new CSV file, click Import new file. This will redirect you to the Source screen, from where you can import the CSV file.
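The validation error examples above can be mirrored locally before upload. The sketch below is illustrative only; the datatype names (e.g. "CoordinateLatitude") are assumptions for this example, not the product's identifiers:

```python
def validate_value(datatype, value):
    """Return None if the value is valid, else a failure reason.

    Mirrors a few of the validation examples described above; the datatype
    names here are hypothetical labels, not Mavis DB identifiers.
    """
    if datatype == "String" and len(value) > 500:
        return "String datatype has more than 500 characters"
    if datatype == "Integer":
        try:
            int(value)
        except ValueError:
            return "Integer datatype has a non-integer value"
    if datatype == "CoordinateLatitude":
        if not -90 <= float(value) <= 90:
            return "Coordinate Latitude falls outside -90 to 90 degrees"
    return None
```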
Note: When the Data Import is in progress –
- All data operation requests (such as Add Rows, Update Rows, Delete Rows and Truncate Table) will be queued until the data import is completed.
- You will not be able to Add, Edit or Delete a column.
- You will not be able to perform the Restore Table action.
Once the data has been successfully uploaded, you can find specific data within the table. To search within the data that’s been uploaded –
- On the Data screen, click Advanced Search.
- On the Data Search Options pop-up, select the relevant filter conditions – All Criteria (And) or Any Criteria (Or).
- From the Column Name dropdown, select the column within which you want to carry out the search. Only columns marked as “Is Queryable” will show up in the dropdown.
- From the Operator dropdown, select an appropriate value. The approved relational operators for each datatype are listed below.
- In the Value field, enter a relevant value.
- You can delete a search condition by clicking , present under Actions.
- To add further search conditions, click the Add button.
- Once you’re done adding conditions, click Search. This will return the relevant data field(s).
- Only columns marked as “Is Queryable” can be used in the search filter.
- Coordinate, JSON, Text columns cannot be marked as “Is Queryable”.
- The search value is case-sensitive.
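Conceptually, the All Criteria (And) and Any Criteria (Or) modes combine the individual conditions as sketched below. This is a Python illustration of a few of the operators, not the product's implementation:

```python
def matches(row, conditions, mode="And"):
    """Check a row against a list of (column, operator, value) conditions.

    mode="And" requires every condition to hold (All Criteria);
    mode="Or" requires at least one (Any Criteria). Comparisons are
    case-sensitive, as the search itself is.
    """
    def check(col, op, val):
        cell = row[col]
        return {
            "Equals": cell == val,
            "Not-Equals": cell != val,
            "Starts With": cell.startswith(val),
            "Ends With": cell.endswith(val),
            "Contains": val in cell,
        }[op]

    results = (check(*c) for c in conditions)
    return all(results) if mode == "And" else any(results)
```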
The following Relational Operators are available on the Data Search Options pop-up, depending on the column’s data type – Equals, Not-Equals, Starts With, Ends With, Contains, Greater Than Equal To, Less Than Equal To, Greater Than, Less Than, Between, Exists, and Not Exists.
To update an individual row within the table –
- Select the row you want to update by clicking the checkbox.
- Hover your cursor over the menu, and click Update Row.
- On the Update Row screen, update the values in the relevant fields.
- Once you’re done, click Update. You’ll get a message saying, “Row updated successfully”.
- To undo the changes, click Cancel.
Note: You can only update one row at a time.
To add an individual row within the table –
- Hover your cursor over , and click Add Row.
- On the Add Row screen, enter values in the relevant fields, and click Add. The row is successfully saved.
Note: You can only add one row at a time.
To delete up to 200 rows within the table –
- Select the rows you want to delete by clicking the checkbox against those rows.
- To select rows in bulk, increase the number of rows shown per page from the Show dropdown, and select all the rows by clicking the checkbox in the table header.
- Hover your cursor over , and click Delete Rows.
- On the Delete Rows pop-up, click Delete.
- To cancel deleting the rows, click Cancel.
To delete more than 200 rows –
- You must apply relevant Advanced Search filters.
- After applying it, click on top of the grid.
- Hover your cursor over , and click Delete Rows.
- On the Delete Rows pop-up, to delete all the rows matching the filter, select Delete all filtered rows, and click Delete.
- To delete only the rows present on that particular page, select Delete filtered rows on this page.
Note: If you have selected more than 200 rows to delete, the request will then be queued, and its status can be tracked in the Request History screen.
This action will delete all the data stored in that particular table.
- Hover your cursor over the menu, and click Truncate Table.
- On the Truncate Table pop-up, enter the table name you want to truncate.
- Then, click Confirm. You’ll get a message saying, “Truncate table request has been accepted”, and you will automatically be redirected to the Request History page.
- To cancel truncating the table, click Cancel.
To export the entire table out of LeadSquared –
- Hover your cursor over , and click Export Data.
- On the Export Data pop-up, click Export.
- On the Export Data pop-up, click here. This redirects you to the Request History screen.
- On the Request History screen, against the Export Table request, hover your cursor over, and click Download.
To know how to audit a table, please refer to Table Audit.