Modern Data Architectures for AWS
© 2023, Amazon Web Services, Inc. or its affiliates. All rights reserved. 1
MODERN DATA ARCHITECTURE FOR CLOUD-NATIVE DEVELOPMENT
Table of contents
• A modernized database is efficient, agile, and scalable
• Modern applications
• What is cloud-native? Why does it matter?
• Modern data management for cloud-native development
• Introducing modern application architectures
• Purpose-built databases: NoSQL
• DevOps for databases
• Key takeaways for Database DevOps
• Implementing modern data management processes and tools
• Ingesting logs and metrics
• Recap: Modern data architecture
• Implement your modern data management strategy today
• Conclusion
A modernized database is
efficient, agile, and scalable
Our goals as developers are to deliver business value faster and create products
that help us stay competitive in the market. To achieve these goals, we need
to modernize by adopting new technologies and practices.
In this chapter, you’ll learn the importance of test automation, data management,
and data democratization in enterprises, and how to integrate data changes into
your Continuous Integration/Continuous Delivery (CI/CD) pipeline.
Modern applications
Modern applications were born out of a necessity to deliver smaller
features faster to customers. While this directly addresses only the
application architecture aspect, it also forces other teams to build and
execute in a similar manner. In order to continuously deliver features,
organizations need all cross-functional teams to operate as “One Team.”
Key aspects of modern applications:
• Use independently scalable microservices such as serverless and containers
• Connect through APIs
• Deliver updates continuously
• Adapt quickly to change
• Scale globally
• Are fault tolerant
• Carefully manage state and persistence
• Have security built in
Modern applications require more performance, scale, and availability

Modern applications are pushing boundaries. Users are connected all the time and expect microsecond latency and access from mobile and internet of things (IoT) devices. They also demand the ability to connect from wherever they happen to be.
Examples include e-commerce, media streaming, social media, online gaming, the shared economy, and development in which apps and storage are decoupled.
What is cloud-native? Why does it matter?

This definition has to apply broadly to everyone, but not everyone has the same capabilities. This is known as the lowest common denominator problem: in trying to appeal to a broader group and their capabilities, you also have to limit the capabilities that can be leveraged.
Modern data management
for cloud-native development
As we cover the different capabilities your organization needs to acquire
to go fully cloud-native, it’s useful to view each one as a step in a journey.
The map below is a model for how organizations typically evolve their cloud-native
understanding. As your organization or team moves from stage to stage, capabilities
are gained that make releasing new features and functionality faster, better, and cheaper.
In the following sections, we’ll be focusing on the capability of Modern Data Management.
5. Modern Data Management
A look back—traditional three-tier application architecture

Let’s first take a look at the traditional architecture model that preceded cloud-native data management.

1. A presentation layer hosted by web servers
2. An application layer in which business logic runs on application servers
3. A data layer, typically a single relational database
Introducing modern application architectures

Here is an example of a modern application architecture. Yes, it does indeed resemble a three-tier web architecture, but there are some very important differences.

Diagram: a presentation layer connects through events and APIs to the business logic, which communicates through queues/messages with multiple data stores.

Pro tip: The best tool for a job usually differs by use case, so you should build new applications with purpose-built databases. How and where you store your data is different in a modern application. Each database is serving a specific need. This will be covered in depth later.

The first major difference in this architecture is that it doesn’t reflect the entire application—this is not a monolith, it’s a single microservice. The other key differences with this architecture are related to:

Data. How and where you store your data is different in a modern application. In this diagram, we can see multiple data store options. Each database is serving a specific need, meaning it’s purpose-built for the task at hand. Additionally, data stores are decoupled.

Application integration and communication. Communication both within the application as well as between each service is different. In modern data architectures, there are fundamentally different approaches to using messaging, events, and APIs within the business.

How AWS customers use microservices: AWS customers run hundreds or even thousands of microservices, an approach that greatly improves their scalability and fault tolerance. Usually, microservices communicate via well-defined APIs, and many customers start the process of refactoring by wrapping their applications with an API.
Decoupling data along with business logic
When it comes to the data requirements of modular services, one size does not fit all.
Does the service need massive data volume? High-speed rendering? Data warehousing?
AWS customers are considering what they are doing with their data and choosing the
datastore that best fits that purpose.
Because the only database choice was a relational database—no matter the shape or
function of the data in the application—the data was modeled as relational for decades.
Instead of the use case driving the requirements for the database, it was the other way
around. The database was driving the data model for the application use case.
You’ll likely remember that one of the core concepts related to microservices is that each
service needs to own its own data. But why is this important? There are two really big
innovation benefits to following this approach.

1. The first is that each service can choose the data store that is purpose-built for its
use case, rather than forcing every workload into one shared database.

2. The second really important point is that each data store can scale independently.
This means that one service may have a small database and another a very large
one, but each service can optimize its scale for performance and cost.
Purpose-built databases:
data models and use cases
Purpose-built
databases: NoSQL
So far, we’ve talked about modern applications and how these modern
applications achieve scale and high availability by using purpose-built
databases. Let’s now look at a specific family of databases called NoSQL.
NoSQL stands for Not only SQL. NoSQL databases are non-tabular and
store data differently than relational tables. NoSQL databases come in a
variety of types based on their data model. The main types are document,
key-value, wide-column, and graph. They provide flexible schemas and
scale easily with large amounts of data and high user loads.
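The four main types above can be illustrated with plain Python structures standing in for real databases. This is only a sketch; all keys and values are invented, and each structure mimics the shape of data in the corresponding database family.

```python
# Document: a self-contained, nested record (as in MongoDB or Amazon DocumentDB)
document = {
    "_id": "user-42",
    "name": {"first": "John", "last": "Doe"},
    "addresses": [{"city": "London", "postal_code": "SE1 8DJ"}],
}

# Key-value: opaque values reached by a single key (as in DynamoDB or Redis)
key_value = {"session:abc123": '{"user": "user-42", "ttl": 3600}'}

# Wide-column: rows keyed by (partition, sort) whose columns may differ per row
# (as in Cassandra or Amazon Keyspaces)
wide_column = {
    ("user-42", "2023-01-01"): {"page": "/home", "latency_ms": 12},
    ("user-42", "2023-01-02"): {"page": "/cart"},  # fewer columns is fine
}

# Graph: nodes plus explicit, typed edges (as in Amazon Neptune or Neo4j)
graph = {
    "nodes": {"user-42": {"type": "person"}, "book-7": {"type": "product"}},
    "edges": [("user-42", "PURCHASED", "book-7")],
}

# Each model optimizes a different access shape: by identity (key-value),
# by aggregate (document), by sparse columns (wide-column), by relationship (graph).
purchases = [e for e in graph["edges"] if e[0] == "user-42" and e[1] == "PURCHASED"]
print(purchases)  # [('user-42', 'PURCHASED', 'book-7')]
```

The point is not the Python itself but the access shape each structure serves: picking the database family starts with picking the shape of the question you will ask most often.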
Deeper into traditional three-tier application architectures

As we touched on earlier, developers were stuck with the single three-tier application architecture
—before the introduction of cloud technology. Understanding this previous limitation is helpful in
grasping the value of the cloud-native tools available today.

With the three-tier application architectures, developers were limited to a relational database
management system. And this database was responsible for everything, from key-value access
to analytics (OLAP)!

Let’s look at some common operations that the single database was responsible for:

1. As companies started out, the database was responsible for transactional create, read,
update, and delete (CRUD)-based queries like getUser or updateOrderStatus. These types
of operations are called online transaction processing (OLTP).

2. As the companies’ systems grew, queries grew to match the complexity of the systems
and teams would join multiple tables and write queries with nested inner joins.

3. Once the company needed to write reports, the database was responsible for long-running
analytical style queries that would impact available resources on the database while the
database was servicing customer requests.

4. If the team needed something as simple as a basic key-value datastore with hundreds
of thousands of entries, the database had to also handle this load.
SQL vs. NoSQL

SQL (optimized for storage) vs. NoSQL (optimized for compute):
• Normalized/relational vs. denormalized/hierarchical
• Ad hoc queries vs. instantiated views
• Scale vertically vs. scale horizontally
• Good for OLAP vs. built for OLTP at scale

NoSQL was introduced to address the limitations we learned about in the previous section.

With SQL, developer teams didn’t need to think about the access patterns ahead of time.
They normalized, reduced redundancy and storage, and queried data on the fly—but at
the cost of scale.

NoSQL was designed at the time when developers focused on OLTP and storage was becoming
cheaper. Because storage was cheap, duplicating data was suddenly not a bad thing after all.

If a developer was designing for NoSQL, they had to know their access patterns ahead of time.
Access patterns or query patterns define how the users and the system access the data to satisfy
business needs. Using SQL means you don’t need to know this access pattern ahead of time.

When we talk about NoSQL, it’s important to understand that it’s not good for everything—
it’s good for a certain class of applications. Those applications have repeatable access patterns.

In the case of a relational database, the ad hoc engine gives developers some flexibility.
If a developer doesn’t yet understand how they are going to access data, then it could
be very beneficial to have an ad hoc query engine, and that’s really suitable for an online
analytics processing (OLAP) type of workload.
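To make the trade-off concrete, here is a small sketch using SQLite from the Python standard library. The schema, item names, and key format are invented for illustration: the relational side answers a query we did not plan for, while the NoSQL-style item answers only the access pattern it was written for, in a single key lookup.

```python
import sqlite3

# Relational side: normalized tables support ad hoc, read-time queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE products (id INTEGER PRIMARY KEY, type TEXT, price REAL);
    CREATE TABLE books (product_id INTEGER, author TEXT, title TEXT);
    INSERT INTO products VALUES (1, 'book', 12.99), (2, 'album', 9.99);
    INSERT INTO books VALUES (1, 'J. Doe', 'Cloud Patterns');
""")

# An ad hoc join, decided at read time (OLAP-friendly flexibility):
row = conn.execute("""
    SELECT b.title, p.price FROM products p
    JOIN books b ON b.product_id = p.id
    WHERE p.type = 'book'
""").fetchone()
print(row)  # ('Cloud Patterns', 12.99)

# NoSQL-style side: the same answer pre-joined at write time, keyed by
# the one known access pattern, "get product by id":
items = {"PRODUCT#1": {"type": "book", "price": 12.99,
                       "author": "J. Doe", "title": "Cloud Patterns"}}
print(items["PRODUCT#1"]["title"])  # Cloud Patterns
```

The dictionary lookup scales horizontally with trivial partitioning; the join does not. That is the storage-for-compute trade the table above summarizes.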
Product database: normalized vs. denormalized

On the normalized (relational) side, the catalog is spread across many tables: a Products table
(ID, type, price, description) plus separate Books, Albums, and Videos tables, with Tracks, Actor,
and an Actor/video join table linked together by IDs.

On the denormalized side, each product is stored as a single JSON document that embeds
everything needed to serve it: a book document carries its title, category, and fiction flag;
an album document embeds its artist, genre, and an array of tracks with titles and durations;
a video document embeds its producer, director, and an array of actors with names, ages,
genders, and short bios—alongside the common product ID, type, price, and description fields.

SQL vs. NoSQL design pattern: here, we step into the world of aggregated items, which are
essentially prebuilt items. Rather than running queries/joins across tables, these items are
written as they will be retrieved. This requires understanding access patterns and writing the
data appropriately, the benefit being nearly limitless scale and consistent performance.
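The aggregated-item idea can be sketched in a few lines of Python. All field names are illustrative, loosely following the product-catalog example above: the normalized rows are joined once, at write time, into the document a reader will fetch.

```python
# Normalized source rows (as they might sit in relational tables).
product = {"id": 10, "type": "album", "price": 11.99}
album = {"product_id": 10, "title": "Blue", "artist": "A. Artist"}
tracks = [
    {"album_id": 10, "title": "Track 1", "duration": 215},
    {"album_id": 10, "title": "Track 2", "duration": 187},
]

def build_album_item(product, album, tracks):
    """Denormalize one album into the prebuilt document a reader will fetch."""
    return {
        "PRODUCT_ID": product["id"],
        "TYPE": product["type"],
        "PRICE": product["price"],
        "TITLE": album["title"],
        "ARTIST": album["artist"],
        "TRACKS": [{"TITLE": t["title"], "DURATION": t["duration"]}
                   for t in tracks if t["album_id"] == product["id"]],
    }

item = build_album_item(product, album, tracks)
print(len(item["TRACKS"]))  # 2
```

The cost is paid once per write (and on updates to the embedded data); every read is then a single fetch with no joins, which is what keeps read latency flat as the table grows.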
Pro tip:
Forget what you know
about relational databases!
Designing for the cloud means thinking differently—your data
access patterns are important

When designing for NoSQL, developers need to have the following information up front:

• Access patterns that predetermine the way data will be accessed. If this can’t be done
up front, NoSQL may be the wrong approach for the project.
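As a sketch of access-pattern-first design, the following Python snippet simulates a composite-key table in the style of DynamoDB partition and sort keys. The key format, entity names, and data are invented; the point is that the keys are derived from the access pattern ("list a customer's orders, newest first"), not from the shape of the entities.

```python
table = {}

def put_order(customer_id, order_date, order):
    # Partition key + sort key chosen to serve the access pattern directly.
    table[(f"CUST#{customer_id}", f"ORDER#{order_date}")] = order

def orders_for_customer(customer_id):
    # "Query" = filter on the partition key, sort descending on the sort key.
    pk = f"CUST#{customer_id}"
    keys = sorted((k for k in table if k[0] == pk), reverse=True)
    return [table[k] for k in keys]

put_order("42", "2023-01-05", {"total": 30})
put_order("42", "2023-03-01", {"total": 12})
put_order("99", "2023-02-02", {"total": 99})

print(orders_for_customer("42"))  # [{'total': 12}, {'total': 30}]
```

If a second access pattern appears later ("all orders on a given date"), this key design cannot serve it efficiently, which is exactly why the patterns must be known up front.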
NoSQL use cases

Built for the cloud, NoSQL databases can scale nearly infinitely without
impacting performance. Additionally, data models for NoSQL in JSON are
natural for developers, as the data is similar to code. This list of use cases
is not exhaustive; these are just a few examples that take advantage of
NoSQL’s scale and flexibility as a data store.

Here we see some typical use cases for NoSQL. Most of them take
advantage of the key benefit of using NoSQL: having a flexible schema
that allows for changes and experimentation in the application.
Data democracy and silos

One of the key issues that data democracy addresses is the data silo—isolated stores of data,
such as separate marketing and manufacturing systems, that the rest of the organization
cannot see. Any size organization can end up with challenges related to data silos if they’re not
intentional about preventing them. Here are some ways to break down data silos and connect
data assets:

• Data warehouses and data lakes can store large amounts of structured or unstructured
data in a repository
• Culture changes that might consist of instituting a data governance initiative
or change management program

The cost of data silos depends on the organization but can impact businesses in terms of finances,
productivity, effectiveness, missed opportunities, and a lack of trust around data.
Diagram: a data warehouse serving BI reports, alongside a data lake.
Data lakehouse

A data lakehouse brings in the best of both the data warehouse and the data lake, combining
the reliability, structure, and atomicity, consistency, isolation, and durability (ACID) transactions
of data warehouses with the scalability and agility of data lakes.

A lakehouse enables business intelligence and machine learning for all data. Lakehouses bring
different organizational personas together onto one platform: data scientists, data engineers,
data teams, business professionals, and software developers.

Vendor lock-in and disjointed tools hinder the ability to perform real-time analytics that drive
and democratize smarter business decisions. Lakehouses allow for rapid ingestion of all your
data sources at scale to make better investment decisions, quickly detect new fraud patterns,
and bring real-time capabilities to risk management practices.

Diagram: ETL feeds structured, semi-structured, and unstructured data into a data lake, with a
metadata and governance layer on top serving BI reports, data science, machine learning, and
real-time decisions.
DevOps for databases
We just talked about data lakehouses and the capabilities that can give
your business an edge. Now we are going to shift focus and talk about
DevOps for databases.
Continuous Integration/Continuous Delivery for databases

CI/CD for databases enables the rapid integration of database schema and logic changes into
application development efforts and provides immediate feedback to developers on any issues
that arise. Database CD produces releasable database code in short, incremental cycles.

CI/CD for databases leverages DevOps tools such as Jenkins, CircleCI, and AWS CodePipeline,
and layers in database-specific tooling that allows database changes to follow the same
development workflows that software engineering teams are already familiar with.
Database CI/CD flow: develop database code alongside application code → source control →
the CI tool builds an artifact and runs tests → if the tests pass, the CD tool deploys the change
through Dev, Test, and Prod; if they fail, the change does not proceed.
The database is a very critical system for most applications and needs
to be extremely resilient. Organizations need to have a clear process
and practice for database backup and disaster recovery. Mature
DevOps implementations practice gamedays and chaos engineering
to make sure their backup and DR strategies work as expected,
without causing any downtime.
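The pass/fail loop described above can be sketched with a toy migration runner. Real tools such as Liquibase add changelogs, locking, and drift detection; everything here (SQLite, the statement list, the test callback) is illustrative only.

```python
import sqlite3

# Manual transaction control so DDL participates in rollback (SQLite
# supports transactional DDL).
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

def apply_change(conn, statements, test_fn):
    """Apply statements inside one transaction; roll back if the test fails."""
    conn.execute("BEGIN")
    try:
        for stmt in statements:
            conn.execute(stmt)
        if not test_fn(conn):
            raise RuntimeError("post-change test failed")
        conn.execute("COMMIT")
        return "deployed"
    except Exception:
        conn.execute("ROLLBACK")
        return "rolled back"

# A passing change: add a column, then verify the new shape is queryable.
r1 = apply_change(
    conn,
    ["ALTER TABLE users ADD COLUMN created_at TEXT"],
    lambda c: len(c.execute("SELECT * FROM users").description) == 3,
)
print(r1)  # deployed

# A failing change: the post-deployment test fails, so the new table is
# rolled back and the engineer would be notified.
r2 = apply_change(
    conn,
    ["CREATE TABLE orders (id INTEGER)"],
    lambda c: False,  # simulate a failing test
)
print(r2)  # rolled back
```

The key property mirrored from the diagram is that the change and its test are one atomic unit: a failed test leaves every environment exactly as it was.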
Test data management brings together:
• Powerful version control
• Consistent, production-like environments
• Realistic virtual environments that can be shared in their current state
Happy path testing—a type of software testing that uses known input and produces an expected
output—is straightforward and often does not require extensive amounts of data.

But for Continuous Integration, Continuous Delivery, and Continuous Deployment to work effectively,
the tests need to be complete—verifying all scenarios and edge cases. This is not possible unless you
have access to production-scale data (of course, keeping regulatory compliance and any other
organizational policies in mind).

A good example would be debugging a failed order: Was it due to a third-party payment gateway
being down, or a data schema change? Without complete visibility into a transaction, it is impossible
to debug and identify the root cause.
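One common test data management technique consistent with the compliance caveat above is deterministic pseudonymization: sensitive fields are replaced before data reaches test environments, but the same input always maps to the same masked value, so joins and repeat-customer edge cases survive. The field names and salt below are invented for this sketch.

```python
import hashlib

def mask_email(email, salt="test-env-1"):
    """Deterministically replace a real email with a safe synthetic one."""
    digest = hashlib.sha256((salt + email).encode()).hexdigest()[:10]
    return f"user_{digest}@example.test"

prod_rows = [
    {"order_id": 1, "email": "john@corp.com", "total": 30},
    {"order_id": 2, "email": "jane@corp.com", "total": 12},
    {"order_id": 3, "email": "john@corp.com", "total": 7},
]

# Copy production-scale data with sensitive fields masked.
test_rows = [{**r, "email": mask_email(r["email"])} for r in prod_rows]

# Determinism preserves behavior (e.g., repeat buyers still look like
# repeat buyers) without exposing real identities.
print(test_rows[0]["email"] == test_rows[2]["email"])  # True
print(test_rows[0]["email"] == prod_rows[0]["email"])  # False
```

Keeping the salt per-environment means masked values cannot be correlated across environments, while a single environment stays internally consistent.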
Key takeaways for Database DevOps

Let’s review some of the important concepts we’ve covered so far.

NoSQL does not mean non-relational. Relationships are accessed and modeled differently
than in SQL. Developers need to model data differently in NoSQL, so forget what you know
about RDBMS. Think about access patterns when modeling. NoSQL can be a great fit for OLTP,
where databases are read, written, and updated frequently.

Data democracy. Data lakehouses provide the flexibility, cost-efficiency, and scale of data lakes
with the data management and ACID transactions of data warehouses. Plus, lakehouses are
designed for most organizational personas, which further democratizes data.

1. CI/CD for database changes using tools like Liquibase
2. Backup and DR to ensure resiliency and that applications are built to handle failures
3. Test data management to achieve continuous deployment, where there is no manual
intervention to deploy to production
4. Observability into MELT (metrics, events, logs, and traces) data across all systems,
including databases, for complete root cause analysis while debugging issues
Implementing modern data
management processes and tools
In this section, we’ll look at some of the best-fit tools you could use to achieve the tenets discussed in
the previous section. At AWS, we’ve long been believers in enabling builders to use the right tool for the
job—and when you build with AWS, you’re provided with choice. You can build using the native services
AWS provides or use AWS Marketplace to acquire third-party software offered by AWS Partners to take
away the heavy lifting and allow your development teams to focus on delivering value to customers.
Let’s take a deeper look at three key components at this stage of your cloud-native journey: a way to get
data out of silos and democratize data for all personas in the company, a purpose-built database for OLTP
applications that can scale to meet the demands of customers, and end-to-end Observability of your
application and data workloads.
AWS services at this stage include Amazon Kinesis, Amazon CloudWatch, Amazon EventBridge,
Amazon EKS, Amazon DynamoDB, AWS CodeCommit, AWS CodeDeploy, AWS Device Farm,
Amazon CodeCatalyst, AWS Cloud9, and AWS Lambda.
AWS Marketplace is a cloud marketplace that makes it easy to find, try, and acquire the tools you need to
build cloud-native. More than 13,000 products from over 3,000 Independent Software Vendors are listed in AWS
Marketplace—many of which you can try for free and, if you decide to use, will be billed through your AWS account.
For our example architecture, we’ll use MongoDB for purpose-built databases, Databricks to
democratize data, and Elastic for end-to-end visibility. Here is an architectural diagram that
illustrates how these three components can be implemented.

Diagram: inside a VPC, a DevOps pipeline takes changes from engineers through a Git repo to
build, test, and release stages. Database changes are tested; a failure notifies the engineer,
while a pass deploys the change. The Lakehouse Platform sits alongside Amazon S3 and
Amazon Aurora.

MongoDB Atlas provides the purpose-built database for OLTP, along with Amazon Simple
Storage Service (Amazon S3) for object storage, and Amazon Aurora for online analytic
processing.
More about how each component
plays a part in the architecture
How the solution works

When an engineer commits a database change, the change triggers a DevOps pipeline that
reads in the XML, creates the change set, executes the change, and tests the database changes.
If the change passes the tests, the change is promoted, and the environment database is
updated. If the change fails, the database change is rolled back and a notification is sent
to the engineer.

Diagram: the DevOps pipeline promotes passing changes through Dev, Test, and Prod
environments inside the AWS Cloud VPC.
Cloud-native NoSQL with MongoDB Atlas

Diagram: applications in a VPC subnet reach MongoDB Atlas cluster nodes through an endpoint
service, with AWS Direct Connect linking on-premises networks to MongoDB Atlas.
The document model is fundamentally different

MongoDB’s document model naturally maps to objects in code and eliminates the need to use
object-relational mappers (ORMs). It breaks down complex interdependencies between
developer and database administration (DBA) teams.

With MongoDB, developers can represent data of any structure, and each document
can contain different fields. The schema can be modified at any time. Additionally, MongoDB
is strongly typed for ease of processing, with over 20 binary-encoded JSON data types.
{
  "_id" : ObjectId("5ad88534e3632e1a35a58d00"),
  "name" : {
    "first" : "John",
    "last" : "Doe" },
  "address" : [
    { "location" : "work",
      "address" : {
        "street" : "16 Hatfields",
        "city" : "London",
        "postal_code" : "SE1 8DJ" },
      "geo" : { "type" : "Point",
                "coord" : [ -0.109081, 51.5065752 ] } },
    + {...}
  ],
  "dob" : ISODate("1977-04-01T00:00:00Z"),
  "retirement_fund" : NumberDecimal("1292815.75")
}
The example illustrated here shows the relational model on the left and a document model
on the right. Notice that all the information contained on the left is also on the right side,
which highlights the simplicity of having all information contained in a single JSON object.
It’s easy to read and a developer doesn’t have to understand the complex tabular relationship.
Flexible: Adapt to change

Perhaps the most powerful feature of document databases is the ability to nest objects inside
of documents. A good rule of thumb for structuring data in MongoDB is to prefer embedding
data inside documents as opposed to breaking it apart into separate collections. There are, of
course, some exceptions, such as needing to store unbounded lists of items or needing to look
up objects directly without retrieving a parent document.

Before the change:

{
  "_id" : ObjectId("5ad88534e3632e1a35a58d00"),
  "name" : {
    "first" : "John",
    "last" : "Doe" },
  "address" : [
    { "location" : "work",
      "address" : {
        "street" : "16 Hatfields",
        "city" : "London",
        "postal_code" : "SE1 8DJ" },
      "geo" : { "type" : "Point",
                "coord" : [ -0.109081, 51.5065752 ] } },
    + {...}
  ],
  "dob" : ISODate("1977-04-01T00:00:00Z"),
  "retirement_fund" : NumberDecimal("1292815.75")
}

After adding phone numbers (no migration required):

{
  "_id" : ObjectId("5ad88534e3632e1a35a58d00"),
  "name" : {
    "first" : "John",
    "last" : "Doe" },
  "address" : [
    { "location" : "work",
      "address" : {
        "street" : "16 Hatfields",
        "city" : "London",
        "postal_code" : "SE1 8DJ" },
      "geo" : { "type" : "Point",
                "coord" : [ -0.109081, 51.5065752 ] } },
    +
  ],
  "phone" : [
    { "location" : "work",
      "number" : "+44-1234567890" },
    +
  ],
  "dob" : ISODate("1977-04-01T00:00:00Z"),
  "retirement_fund" : NumberDecimal("1292815.75")
}
Note in this example that the name field is a nested object containing both given and family
name components, and that the address field stores an array containing multiple addresses.
Each address can have different fields in it, which makes it easy to store different types of data.
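Schema flexibility also shows up on the read path: application code must tolerate documents written before and after a change. A minimal Python sketch, with plain dicts standing in for MongoDB documents and all field names invented:

```python
# An older document written before "phone" existed, and a newer one after.
old_doc = {"_id": "u1", "name": {"first": "John", "last": "Doe"}}
new_doc = {"_id": "u2", "name": {"first": "Jane", "last": "Doe"},
           "phone": [{"location": "work", "number": "+44-1234567890"}]}

def work_phone(doc):
    """Return the work phone number, tolerating documents without the field."""
    for entry in doc.get("phone", []):  # absent field reads as an empty list
        if entry["location"] == "work":
            return entry["number"]
    return None

print(work_phone(old_doc))  # None
print(work_phone(new_doc))  # +44-1234567890
```

Defaulting missing fields in code is what lets the schema evolve document by document instead of via a stop-the-world migration.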
MongoDB for Microsoft Visual Studio (Microsoft VS) Code

MongoDB also provides an extension for Microsoft VS Code, which lets you work with MongoDB
and your data directly within your coding environment. You can use MongoDB for Visual Studio
Code to:

• Explore your MongoDB data
• Prototype queries and run MongoDB commands
• Create a Shared Tier Atlas cluster using a Terraform template
Databricks Lakehouse Platform

The Databricks Lakehouse Platform combines the best elements of data lakes and data
warehouses to deliver the reliability, strong governance, and performance of data warehouses
with the openness, flexibility, and ML support of data lakes.

Try it in AWS Marketplace ›
Start Databricks on AWS training ›

This unified approach simplifies the modern data stack by eliminating the data silos that
traditionally separate and complicate data engineering, analytics, BI, data science, and machine
learning. It’s built on open source and open standards to maximize flexibility. Additionally, its
common approach to data management, security, and governance helps developers operate
more efficiently and innovate faster.

In our solution, Databricks integrates with the data plane in a customer’s account for access to
MongoDB, Amazon S3, and Amazon Aurora database services. The control plane in the Databricks
cloud environment allows for uniform access for different personas in the organization via HTTPS,
JDBC, ODBC, or REST APIs.

Diagram: Dev, Test, and Prod environments in the AWS Cloud VPC serve data scientists, business
users, IT users, data warehouse users, and data service users.
Key features of Databricks Lakehouse Platform

A unified data catalog enables discovery of data and other artifacts like code and ML models.
Organizations can assign different administrators to different parts of the catalog to decentralize
control and management of data assets. This approach—a centralized catalog with federated
control—preserves the independence and agility of local, domain-specific teams while ensuring
data asset reuse across these teams and enforcing a common security and governance model.
Ingesting logs and metrics

No solution is complete without end-to-end visibility of your application and data stack.
With the complexity of modern environments, tracing issues or performance problems across
microservices that extend into other Availability Zones, Regions, accounts, and even hybrid
clouds is difficult without a tool to aggregate all this information.

There are over 23 out-of-the-box integrations for AWS products and services that allow MELT
data to be sent into the Elastic Cloud. Elastic supports cloud-native technologies such as
Kubernetes, AWS Lambda, DynamoDB, and many others.

Watch a demo and learn more ›
Start a hands-on lab ›

The Elastic stack: Kibana (explore, visualize, engage), Elasticsearch (store, search, analyze),
and integrations (connect, collect, alert).
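A precondition for this kind of aggregation is emitting logs as structured events rather than free text, so an aggregator such as Elastic can index fields instead of parsing strings. A minimal Python sketch using only the standard library; the field names follow no particular Elastic schema and are illustrative.

```python
import io
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON event with indexable fields."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "service": "orders",  # which microservice emitted this event
            "message": record.getMessage(),
            "trace_id": getattr(record, "trace_id", None),
        })

buf = io.StringIO()  # stands in for a log shipper / stdout
handler = logging.StreamHandler(buf)
handler.setFormatter(JsonFormatter())
log = logging.getLogger("orders")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.propagate = False

# The trace_id ties this log line to the distributed trace it belongs to.
log.info("payment gateway timeout", extra={"trace_id": "abc-123"})

event = json.loads(buf.getvalue())
print(event["trace_id"])  # abc-123
```

Because `trace_id` is a first-class field, the aggregator can join this log line with metrics and traces from other services—the cross-signal correlation that makes MELT data useful for root cause analysis.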
Key benefits of Elastic Cloud Observability
Get up and running in minutes. This not only includes setup of the
Elastic Cloud services, but also ingesting data from AWS accounts and
hybrid cloud accounts.
Make sure your deployment is secure from the ground up, including but
not limited to OS hardening, network security controls, and encryption
for data in motion as well as data at rest.
Recap: Modern data architecture

To recap the solution:

The capabilities of MongoDB are leveraged for NoSQL document storage that can scale to
handle the performance and availability requirements of the most demanding cloud-native
applications. The Databricks Lakehouse Platform breaks down data silos and democratizes
data for every persona in the organization. Elastic Cloud Observability brings together MELT
data into one place to visualize.

An important final point is that all of these services are SaaS offerings on AWS that abstract
away the heavy lifting of managing and maintaining servers and infrastructure, helping
developers focus on creating new features to drive increased customer satisfaction.

Diagram: the solution architecture, with a DevOps pipeline inside a VPC testing and deploying
database changes, and the Lakehouse Platform alongside Amazon S3 and Amazon Aurora.
Implement your modern data management strategy today

Databricks Lakehouse Platform, MongoDB Atlas, and Elastic Cloud can be used together along
with AWS to build the foundation for establishing a well-engineered approach to modern data
architecture. Establishing these capabilities can provide a strong foundation for continuing to
advance through your cloud-native journey. You can explore these and other DevOps and
modern data management-focused tools in AWS Marketplace.

Part of the reason for this is that AWS Marketplace is supported by a team of solution
architects, security experts, product specialists, and other experts to help you connect with
the software and resources you need to succeed with your applications running on AWS.

• AWS Control Tower
• AWS Service Catalog
• AWS CloudFormation (Infrastructure as Code)
• Software as a Service (SaaS)
• Amazon Machine Image (AMI)
• Amazon Elastic Container Service (ECS)
• Amazon Elastic Kubernetes Service (EKS)
Get started today
Visit [Link] to find, try, and buy software with flexible pricing and multiple deployment
options to support your use case.
Authors:
James Bland
Global Tech Lead for DevOps, AWS
Aditya Muppavarapu
Global Segment Leader for DevOps, AWS