Lab 1.
Exploring AWS CloudShell and the AWS Cloud9 IDE
Accessing the AWS Management Console
1. Click Start Lab to launch your lab.
2. Wait for the message Lab status: ready and close the panel.
3. Click AWS at the top to open the AWS Management Console in a new tab.
4. If a pop-up is blocked, allow pop-ups in your browser settings.
5. Arrange the AWS Management Console tab alongside these instructions for easy
reference.
6. To expand the instructions, uncheck the Terminal option in the top right corner.
Task 1: Exploring AWS CloudShell
1. Click the AWS CloudShell icon at the top of the AWS Console.
2. If a "Welcome to AWS CloudShell" pop-up appears, close it.
3. Wait 1–2 minutes for the terminal to become available.
4. Verify AWS CLI installation by running: aws --version
5. Test AWS CLI by running: aws s3 ls (This lists existing S3 buckets.)
6. Open another terminal panel: Actions menu > Tabs layout > Split into columns.
7. Download the Python script list-buckets.py using the right-click menu.
8. Upload the file to CloudShell: Actions menu > Files > Upload file > Select file.
9. Confirm the upload is successful, then run: cat list-buckets.py
10. Execute the script: python3 list-buckets.py (It should list S3 buckets.)
11. Copy the sample-bucket name from the output.
12. Upload the file to S3 by running: aws s3 cp list-buckets.py s3://<bucket-name>.
13. Verify successful upload (output should confirm the upload path).
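The contents of list-buckets.py come from the lab download and are not reproduced in these instructions. A minimal sketch of what such a script might contain, with the response parsing split into a plain function so it can be shown with sample data, could look like this (the helper name and sample data are hypothetical):

```python
# Hypothetical sketch of a list-buckets.py-style script; the real file
# comes from the lab download. In CloudShell it would call:
#   import boto3
#   response = boto3.client('s3').list_buckets()

def bucket_names(response):
    """Extract just the bucket names from an S3 list_buckets response."""
    return [bucket['Name'] for bucket in response.get('Buckets', [])]

# The real API returns a dict shaped like this sample:
sample_response = {'Buckets': [{'Name': 'my-sample-bucket-123'}]}
print(bucket_names(sample_response))  # → ['my-sample-bucket-123']
```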
Task 2: Exploring VS Code IDE
1. Click Details > AWS: Show at the top.
2. Copy LabIDEURL and LabIDEPassword to a text editor.
3. Open a new browser tab and paste LabIDEURL.
4. Enter LabIDEPassword and click Submit.
5. Observe the VS Code IDE layout:
o Bottom: Bash terminal (similar to CloudShell)
o Left: Navigation pane (file system)
o Runs on: Amazon EC2 instance
6. Get the S3 bucket name: aws s3 ls
7. Download list-buckets.py from S3: aws s3 cp s3://<bucket-name>/list-buckets.py .
8. Open list-buckets.py in the editor (double-click it in the navigation pane).
9. Run the script: python3 list-buckets.py
10. If an error occurs (ModuleNotFoundError: No module named 'boto3'), install it: sudo pip3 install boto3
11. Re-run the script: python3 list-buckets.py (It should now work.)
12. Create a new file: File > New Text File.
13. Add the following text: <body> Hello World. </body>
14. Save as index.html in /home/ec2-user/environment/.
15. Upload to S3: aws s3 cp index.html s3://<bucket-name>/index.html
16. Verify successful upload (output should confirm the upload path).
Lab 2. Working with Amazon S3
Task 1: Connecting to VS Code IDE and Configuring the Environment
1. Connect to VS Code IDE:
➢ Click Details > AWS: Show
➢ Copy LabIDEURL and LabIDEPassword to a text editor.
➢ Open a new browser tab, paste LabIDEURL, and press Enter.
➢ On the Welcome to code-server screen, enter LabIDEPassword and click
Submit.
2. Install AWS SDK for Python:
➢ Open the VS Code bash terminal (bottom of IDE).
➢ Run: sudo pip3 install boto3
3. Download and Extract Lab Files:
➢ Run: wget https://aws-tc-largeobjects.s3.us-west-2.amazonaws.com/CUR-TF-200-ACCDEV-2-91558/02-lab-s3/code.zip -P /home/ec2-user/environment
➢ unzip code.zip
4. Verify AWS CLI Version: aws --version
Task 2: Creating an S3 Bucket Using the AWS CLI
1. Create an S3 Bucket:
➢ Choose a bucket name in this format: <your-initials>-YYYY-MM-DD-s3site
➢ aws s3api create-bucket --bucket <bucket-name> --region us-east-1
➢ Save the bucket name in a text file for later use.
2. Verify Bucket in AWS Console:
➢ Go to AWS Management Console > S3.
➢ Search for your bucket name and confirm its creation.
Task 3: Uploading Files to S3
1. Copy HTML Files to S3 Bucket:
➢ Navigate to the lab directory: cd /home/ec2-user/environment/code
➢ Run: aws s3 cp index.html s3://<bucket-name>/index.html
➢ aws s3 cp error.html s3://<bucket-name>/error.html
2. Confirm File Upload:
➢ Run: aws s3 ls s3://<bucket-name>/
➢ Ensure index.html and error.html are listed.
Task 4: Configuring S3 Bucket for Static Website Hosting
1. Enable Static Website Hosting:
➢ Run: aws s3 website s3://<bucket-name>/ --index-document index.html --error-document error.html
2. Set Bucket Policy:
➢ Create a new JSON file: nano policy.json
➢ Add the bucket policy content, then save and exit (CTRL + X, Y, Enter).
➢ Apply the policy: aws s3api put-bucket-policy --bucket <bucket-name> --policy file://policy.json
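The policy content itself is not shown in these steps. A typical public-read bucket policy for static website hosting looks like the following (substitute your actual bucket name; this is a generic example, not necessarily the lab's exact policy):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        }
    ]
}
```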
3. Access the Static Website:
➢ Run: echo "http://<bucket-name>.s3-website-us-east-1.amazonaws.com"
➢ Open the provided URL in a browser to view the static website.
Task 5: Cleaning Up Resources
1. Delete Files from S3 Bucket:
o Run: aws s3 rm s3://<bucket-name>/ --recursive
2. Delete the S3 Bucket:
o Run: aws s3api delete-bucket --bucket <bucket-name>
3. Confirm Deletion:
o Run: aws s3 ls
o Ensure the bucket no longer appears in the list.
Lab 3. Working with Amazon DynamoDB
Task 1: Preparing the lab
1. Connect to VS Code IDE:
o Click "Details" and choose "AWS: Show."
o Copy LabIDEURL and LabIDEPassword to a text editor.
o Open a new browser tab and paste LabIDEURL.
o Enter LabIDEPassword and click Submit to open VS Code.
2. Download and extract required files:
o Open the VS Code Bash terminal (bottom of the IDE) and run:
o wget https://aws-tc-largeobjects.s3.us-west-2.amazonaws.com/CUR-TF-200-ACCDEV-2-91558/03-lab-dynamo/code.zip -P /home/ec2-user/environment
o Verify code.zip appears in the left panel.
o Extract files: unzip code.zip
o Check that the resources folder appears in the left panel.
3. Upgrade AWS CLI and verify installations:
o Run the following to set permissions and execute the setup script:
o chmod +x ./resources/setup.sh && ./resources/setup.sh
o Verify AWS CLI version: aws --version
o Verify Boto3 installation: pip3 show boto3
Task 2: Creating a DynamoDB table using Python SDK
1. Check for existing tables:
o Open AWS Management Console.
o Search for and select DynamoDB.
o Click Tables and confirm that no tables exist.
2. Edit the script to create the table:
o Return to VS Code IDE.
o Expand the python_3 directory.
o Open create_table.py.
o Replace <FMI_1> with: 'FoodProducts'
3. Understand the script:
o The script configures the DynamoDB resource and sets the AWS region:
o DDB = boto3.resource('dynamodb', region_name='us-east-1')
o It creates the table using: table = DDB.create_table(**params)
4. Run the script to create the table:
o In VS Code Bash terminal, navigate to the script folder: cd python_3
o Run the script: python3 create_table.py
5. Verify table creation:
o Run the following command:
o aws dynamodb list-tables --region us-east-1
o The output should confirm that the FoodProducts table exists.
o Return to the DynamoDB console, refresh, and confirm that the table status is Active.
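The params dict that step 3 refers to is defined in create_table.py and is not reproduced here. A hypothetical sketch of its shape is below; the real script's values may differ, though product_name as the HASH key matches how the key is used later in this lab:

```python
# Hypothetical shape of the params dict passed to DDB.create_table(**params);
# create_table.py in the lab defines the real values.
params = {
    'TableName': 'FoodProducts',
    'KeySchema': [
        {'AttributeName': 'product_name', 'KeyType': 'HASH'},
    ],
    'AttributeDefinitions': [
        {'AttributeName': 'product_name', 'AttributeType': 'S'},
    ],
    'ProvisionedThroughput': {'ReadCapacityUnits': 1, 'WriteCapacityUnits': 1},
}
print(params['TableName'])  # → FoodProducts
```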
Task 3: Working with DynamoDB Data – Understanding DynamoDB Condition Expressions
Step 1: Insert the First Record
1. Open not_an_existing_product.json in the VS Code IDE.
2. Review the JSON file, which contains product_name and product_id attributes.
3. Run the following AWS CLI command to insert the record: aws dynamodb put-item \
4. Verify that the record was added in the DynamoDB console under "Explore table items."
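The put-item command above is cut off at the line continuation. Its full form likely resembles the following sketch, where the flag values are assumptions based on the file and table names used in this lab:

```shell
# Assumed completion of the truncated command; run it from the folder
# that contains the JSON file. The lab's exact flags may differ.
aws dynamodb put-item \
    --table-name FoodProducts \
    --item file://not_an_existing_product.json \
    --region us-east-1
```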
Step 2: Add a Second Record
1. Modify not_an_existing_product.json:
o Change product_name from best cake to best pie.
o Keep product_id unchanged.
2. Run the AWS CLI command again.
3. Verify that both records exist in DynamoDB.
Step 3: Insert a Duplicate Record
1. Re-run the same AWS CLI command without modifying the JSON file.
2. Check the DynamoDB console – no new record is created because the primary key
exists, and the existing record is overwritten.
Step 4: Insert a Record with an Existing Primary Key but a Different Product ID
1. Modify not_an_existing_product.json:
o Keep product_name the same.
o Change product_id to 3333333333.
2. Run the AWS CLI command again.
3. Verify in DynamoDB – the existing product_id is replaced with 3333333333.
Step 5: Prevent Overwriting with Condition Expressions
1. Modify not_an_existing_product.json:
o Keep product_name the same.
o Change product_id to 2222222222.
2. Run the following AWS CLI command:
3. The command returns an error (ConditionalCheckFailedException), meaning the record
was not overwritten.
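The CLI command referenced in step 2 is not shown. It presumably adds a condition expression to the earlier put-item call, along these lines (an assumed sketch, not the lab's verbatim command):

```shell
# Assumed sketch: the condition makes the write fail when an item with
# this partition key already exists, raising ConditionalCheckFailedException.
aws dynamodb put-item \
    --table-name FoodProducts \
    --item file://not_an_existing_product.json \
    --condition-expression "attribute_not_exists(product_name)" \
    --region us-east-1
```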
Task 4: Adding and Modifying Items Using the SDK
Step 1: Update and Run conditional_put.py
1. Open conditional_put.py in VS Code.
2. Replace the <FMI> placeholders with: 'product_name': {'S': 'apple pie'}, 'product_id': {'S': 'a444'}, 'price_in_cents': {'N': '595'}, 'description': {'S': "It is amazing!"}, 'tags': {'L': [{'S': 'whole pie'}, {'S': 'apple'}]}
3. Ensure the condition expression is included: ConditionExpression='attribute_not_exists(product_name)'
4. Run the script: python3 conditional_put.py
5. Verify that the apple pie record was added to DynamoDB.
Step 2: Attempt to Modify the Existing Item
1. Change product_id from a444 to a555.
2. Run python3 conditional_put.py.
3. Verify in DynamoDB – the item remains unchanged.
Step 3: Insert a New Item
1. Change product_name from apple pie to cherry pie.
2. Run python3 conditional_put.py.
3. Verify in DynamoDB – a new record for cherry pie is added without modifying apple pie.
Task 5: Adding multiple items by using the SDK and batch processing
Step 1: Delete All Existing Records
1. Open the DynamoDB Item Explorer and refresh the data by clicking Run.
2. Select all the records in the table.
3. Click Actions → Delete item(s).
4. Confirm the deletion by typing Delete and clicking Delete items.
Step 2: Review the Test Data
1. In VS Code, open resources/test.json.
2. This file contains six product records, including multiple entries for apple pie.
Step 3: Update the Batch Load Script
1. Open python_3/test_batch_put.py in VS Code.
2. Update the placeholders:
o Replace <FMI_1> with FoodProducts (table name).
o Replace <FMI_2> with product_name (primary key).
3. Save and close the file.
Step 4: Run the Batch Load Script
1. Open the terminal in VS Code.
2. Run the script: python3 test_batch_put.py
3. The terminal should display multiple entries being added, with the last apple pie price replacing the earlier ones.
Step 5: Modify Script to Prevent Overwriting
1. Open test_batch_put.py again.
2. Change this line: with table.batch_writer(overwrite_by_pkeys=['product_name']) as batch: to with table.batch_writer() as batch:
3. Save and close the file.
Step 6: Run the Modified Script and Observe Errors
1. Run the script again: python3 test_batch_put.py
2. You should see a ValidationException error: Provided list of item keys contains duplicates
3. No records are added to DynamoDB, preventing incorrect values.
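The difference between the two batch_writer calls can be illustrated with plain Python. This is a conceptual model, not the lab's code: with overwrite_by_pkeys, boto3 deduplicates buffered items by primary key before sending them, keeping the last write; without it, duplicate keys reach DynamoDB and trigger the ValidationException.

```python
# Conceptual model of overwrite_by_pkeys deduplication (not the lab script).
def dedupe_by_pkeys(items, pkeys):
    """Keep only the last item seen for each primary-key combination."""
    seen = {}
    for item in items:
        key = tuple(item[k] for k in pkeys)
        seen[key] = item  # a later duplicate replaces the earlier one
    return list(seen.values())

records = [
    {'product_name': 'apple pie', 'price_in_cents': '100'},
    {'product_name': 'apple pie', 'price_in_cents': '595'},
]
print(dedupe_by_pkeys(records, ['product_name']))
# → [{'product_name': 'apple pie', 'price_in_cents': '595'}]
```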
Step 7: Prepare for Production Data Load
1. In DynamoDB Console, delete all records again.
2. Open resources/website/all_products.json in VS Code.
3. This file contains multiple items, including an optional specials attribute.
Step 8: Run the Production Data Load
1. Open python_3/batch_put.py in VS Code.
2. Replace <FMI> with FoodProducts.
3. Save and close the file.
4. Run the script: python3 batch_put.py
5. The terminal should display multiple records being added.
Step 9: Verify the Data in DynamoDB
1. Open DynamoDB Item Explorer.
2. Refresh and scan the table.
3. You should see 26 or more records added successfully!
Task 6: Querying the Table Using the SDK
Step 1: Edit the Script to Select All Records
1. Open VS Code IDE and navigate to python_3 > get_all_items.py.
2. Do not use the get_all_items.py file in the resources folder.
3. Replace <FMI_1> with FoodProducts (the table name).
4. Save and close the file.
Step 2: Review the get_all_items.py Script
1. Line 15: Defines the scan operation.
2. Line 18: Starts a while loop to handle large scan results (pagination).
3. The while loop ensures that all records are retrieved by appending each page of data to the result.
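The pagination pattern described above can be sketched as follows. This is the generic DynamoDB scan-pagination idiom, not the lab file verbatim; the fake_scan helper stands in for table.scan so the loop can run without a live table:

```python
# Generic DynamoDB scan pagination: each response may carry a
# LastEvaluatedKey, which is fed back as ExclusiveStartKey until the
# table has been read completely.
def scan_all(scan, **kwargs):
    items = []
    response = scan(**kwargs)
    items.extend(response['Items'])
    while 'LastEvaluatedKey' in response:
        response = scan(ExclusiveStartKey=response['LastEvaluatedKey'], **kwargs)
        items.extend(response['Items'])
    return items

# Demonstration with a stand-in for table.scan that returns two pages:
pages = [{'Items': ['page1-item'], 'LastEvaluatedKey': 'k1'},
         {'Items': ['page2-item']}]
fake_scan = lambda **kwargs: pages.pop(0)
print(scan_all(fake_scan))  # → ['page1-item', 'page2-item']
```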
Step 3: Run the Script
1. Open the VS Code terminal.
2. Run the command: python3 get_all_items.py
3. The output will show all items in JSON format.
Step 4: Query a Single Product
Now, Sofía wants to retrieve a specific product instead of all items.
Update the get_one_item.py Script
1. Open get_one_item.py.
2. Replace <FMI_1> with product_name (the primary key).
3. Save and close the file.
Step 5: Review the get_one_item.py Script
1. Lines 13-14: Define the get_item() operation to fetch a product by name.
2. Line 24: Assigns a value to product and passes it to get_one_item().
Step 6: Run the get_one_item Script
1. Open the VS Code terminal.
2. Run the command: python3 get_one_item.py
3. The output will display only the requested item.
Task 7: Adding the GSI and Filtering the Query
1. Update add_gsi.py to Add the GSI
• Modify the script by replacing <FMI_1> with "HASH".
• Ensure the special_GSI index is defined correctly in the GlobalSecondaryIndexUpdates
section.
• Run the script in your VS Code terminal: python3 add_gsi.py
• Check the DynamoDB Console:
o Navigate to Tables > FoodProducts > Indexes tab.
o Wait for the special_GSI status to change from Creating to Active.
2. Update scan_with_filter.py to Query the New Index
• Modify the script:
o Replace <FMI_1> with "special_GSI".
o Replace <FMI_2> with "tags".
• Run the script: python3 scan_with_filter.py
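Conceptually, what the GSI plus filter achieves can be modeled in plain Python. This is an illustration only; the attribute names follow the optional specials attribute mentioned in Task 5 and the tags filter above, but the lab script's exact logic may differ:

```python
# Conceptual model: a GSI is sparse, so only items that carry the index's
# key attribute appear in it; the filter expression then prunes by tags.
def simulate_gsi_scan(items, index_key, wanted_tag):
    indexed = [item for item in items if index_key in item]   # sparse GSI
    return [item for item in indexed if wanted_tag in item.get('tags', [])]

products = [
    {'product_name': 'apple pie', 'specials': 1, 'tags': ['whole pie', 'apple']},
    {'product_name': 'cherry pie', 'tags': ['whole pie', 'cherry']},
]
print(simulate_gsi_scan(products, 'specials', 'whole pie'))
# only apple pie is returned; cherry pie has no 'specials' attribute
```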