AWS DynamoDB JSON Data Import: Some Tips and Tricks
- Check your profile and region in the .aws/credentials file (a sample layout is shown right after this list).
- Check your data file. It should be formatted according to the BatchWriteItem JSON format.
- Check whether your table already exists in AWS DynamoDB. If it does not, create the table first.
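For reference, a minimal profile setup looks roughly like this; the profile name my-profile, the key values, and the region below are placeholders, and the region may also be kept in the .aws/config file.
# ~/.aws/credentials
[my-profile]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[profile my-profile]
region = us-east-1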
Trick: If you try to import from AWS S3, the table will be created, but the data will not be imported. Even though there are no errors in the JSON file, a JSON validation error is thrown. In this case, you can import the data using the AWS CLI instead.
Sample table: ProductCatalog (Id, Title, ISBN, Authors, Price …)
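If the table does not exist yet, it can also be created from the CLI. A minimal sketch for this sample table, assuming Id (a number) as the partition key and on-demand billing, could look like the following.
> aws dynamodb create-table --region YOUR_REGION_HERE --profile YOUR_PROFILE_HERE \
    --table-name ProductCatalog \
    --attribute-definitions AttributeName=Id,AttributeType=N \
    --key-schema AttributeName=Id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST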
BatchWriteItem JSON format sample
{
"ProductCatalog": [
{
"PutRequest": {
"Item": {
"Id": {
"N": "101"
},
"Title": {
"S": "Book 101 Title"
},
"ISBN": {
"S": "111-1111111111"
},
"Authors": {
"L": [
{
"S": "Author1"
}
]
},
"Price": {
"N": "2"
},
"Dimensions": {
"S": "8.5 x 11.0 x 0.5"
},
"PageCount": {
"N": "500"
},
"InPublication": {
"BOOL": true
},
"ProductCategory": {
"S": "Book"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "102"
},
"Title": {
"S": "Book 102 Title"
},
"ISBN": {
"S": "222-2222222222"
},
"Authors": {
"L": [
{
"S": "Author1"
},
{
"S": "Author2"
}
]
},
"Price": {
"N": "20"
},
"Dimensions": {
"S": "8.5 x 11.0 x 0.8"
},
"PageCount": {
"N": "600"
},
"InPublication": {
"BOOL": true
},
"ProductCategory": {
"S": "Book"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "103"
},
"Title": {
"S": "Book 103 Title"
},
"ISBN": {
"S": "333-3333333333"
},
"Authors": {
"L": [
{
"S": "Author1"
},
{
"S": "Author2"
}
]
},
"Price": {
"N": "2000"
},
"Dimensions": {
"S": "8.5 x 11.0 x 1.5"
},
"PageCount": {
"N": "600"
},
"InPublication": {
"BOOL": false
},
"ProductCategory": {
"S": "Book"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "201"
},
"Title": {
"S": "18-Bike-201"
},
"Description": {
"S": "201 Description"
},
"BicycleType": {
"S": "Road"
},
"Brand": {
"S": "Mountain A"
},
"Price": {
"N": "100"
},
"Color": {
"L": [
{
"S": "Red"
},
{
"S": "Black"
}
]
},
"ProductCategory": {
"S": "Bicycle"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "202"
},
"Title": {
"S": "21-Bike-202"
},
"Description": {
"S": "202 Description"
},
"BicycleType": {
"S": "Road"
},
"Brand": {
"S": "Brand-Company A"
},
"Price": {
"N": "200"
},
"Color": {
"L": [
{
"S": "Green"
},
{
"S": "Black"
}
]
},
"ProductCategory": {
"S": "Bicycle"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "203"
},
"Title": {
"S": "19-Bike-203"
},
"Description": {
"S": "203 Description"
},
"BicycleType": {
"S": "Road"
},
"Brand": {
"S": "Brand-Company B"
},
"Price": {
"N": "300"
},
"Color": {
"L": [
{
"S": "Red"
},
{
"S": "Green"
},
{
"S": "Black"
}
]
},
"ProductCategory": {
"S": "Bicycle"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "204"
},
"Title": {
"S": "18-Bike-204"
},
"Description": {
"S": "204 Description"
},
"BicycleType": {
"S": "Mountain"
},
"Brand": {
"S": "Brand-Company B"
},
"Price": {
"N": "400"
},
"Color": {
"L": [
{
"S": "Red"
}
]
},
"ProductCategory": {
"S": "Bicycle"
}
}
}
},
{
"PutRequest": {
"Item": {
"Id": {
"N": "205"
},
"Title": {
"S": "18-Bike-204"
},
"Description": {
"S": "205 Description"
},
"BicycleType": {
"S": "Hybrid"
},
"Brand": {
"S": "Brand-Company C"
},
"Price": {
"N": "500"
},
"Color": {
"L": [
{
"S": "Red"
},
{
"S": "Black"
}
]
},
"ProductCategory": {
"S": "Bicycle"
}
}
}
}
]
}
If the table is already created on DynamoDB, you can run the following command (in the same directory as the file) to import the data.
> aws dynamodb batch-write-item --region YOUR_REGION_HERE --profile YOUR_PROFILE_HERE --request-items file://YOUR_DATA_FILE.json
If all the data is successfully imported, the response will look like this.
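When every item in the batch is written, there are normally no unprocessed items left, so the output is roughly:
{
    "UnprocessedItems": {}
}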

You can also check and verify the table contents from the AWS Console.
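Alternatively, you can spot-check the import from the CLI, for example by counting the items with a scan (a scan reads the whole table, so use it only for small sample data like this):
> aws dynamodb scan --region YOUR_REGION_HERE --profile YOUR_PROFILE_HERE --table-name ProductCatalog --select COUNT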
