AWS re:Invent 2022: The big news

Published on 01/12/2022 | Written by Heather Wright


And some local announcements…

AWS re:Invent took over Las Vegas again this week, with the company announcing a host of new launches. 

Adam Selipsky, AWS chief executive, kicked off the event with a keynote noting the increasing business challenges customers are facing, and the opportunities they present for the company. 

“When it comes to the cloud, many of our customers know that they should be leaning in precisely because of the economic uncertainty, not despite it,” says Selipsky, who took over from Andy Jassy as AWS boss last year. 

Among those issues are, of course, the ongoing challenges with supply chains. AWS’ answer is AWS Supply Chain, an application designed to increase supply chain visibility in real time, combining and analysing data across multiple systems to enable more accurate demand forecasting and providing machine learning-enabled predictions of supply chain risks.  

A ‘zero ETL future’ 

Selipsky also used the event, which he says has more than 50,000 customers and partners in attendance and a further 300,000 attending virtually, to promise a ‘zero ETL future’.  

Getting data from multiple sources into a central repository has always been a challenge, and AWS has declared its vision of a future where there’s no need to extract, transform and load data from source to data warehouse for analytics. 

The cloud giant introduced Aurora zero-ETL integration with Amazon Redshift, enabling customers using the Aurora database and Redshift data warehouse to move data without having to perform ETL.  

AWS claims transactional data written into Aurora is available ‘within seconds’ in Redshift, enabling data from multiple Aurora database clusters to be analysed together. 
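For a sense of how that replicated data might be consumed, here is a minimal sketch using boto3’s Redshift Data API to run an analytical query against a database exposed by a zero-ETL integration; the workgroup, database and table names are illustrative assumptions, not details AWS has published.

```python
# Hedged sketch: querying Aurora transactional data replicated into Redshift
# via a zero-ETL integration. Workgroup, database and table names are
# illustrative placeholders, not real resources.
import boto3

redshift_data = boto3.client("redshift-data")

# Run an analytical query in Redshift; per AWS, rows written to Aurora
# should appear here within seconds of the transaction committing.
response = redshift_data.execute_statement(
    WorkgroupName="analytics-workgroup",   # assumed Redshift Serverless workgroup
    Database="orders_from_aurora",         # database surfaced by the integration
    Sql="SELECT order_date, SUM(total) AS revenue FROM orders GROUP BY order_date;",
)

# The statement runs asynchronously; results are fetched later with
# get_statement_result using the returned statement ID.
print(response["Id"])
```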

Selipsky says AWS’ vision of a zero ETL future will mean data integration is no longer a manual effort.  

In a similar vein, he also announced an Amazon Redshift integration with the open source big data processing platform Apache Spark, enabling customers to run Spark applications on Redshift data using AWS analytics and machine learning services such as EMR, Glue and SageMaker. 
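As a rough illustration of what running Spark against Redshift looks like, the sketch below reads a Redshift table into a Spark DataFrame; the connector format string, JDBC URL, S3 staging directory and IAM role follow the conventions of the open source spark-redshift connector and are assumptions here, not AWS-confirmed specifics.

```python
# Hedged sketch: loading a Redshift table into a Spark DataFrame via a
# Redshift connector. All connection details below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("redshift-spark-sketch").getOrCreate()

df = (
    spark.read
    .format("io.github.spark_redshift_community.spark.redshift")  # assumed connector format
    .option("url", "jdbc:redshift://example-cluster:5439/dev?user=awsuser&password=CHANGE_ME")
    .option("dbtable", "public.orders")                   # illustrative source table
    .option("tempdir", "s3://example-bucket/spark-tmp/")  # S3 staging area the connector uses
    .option("aws_iam_role", "arn:aws:iam::123456789012:role/redshift-spark-role")
    .load()
)

# Spark can now apply its usual transformations to the Redshift data.
df.groupBy("order_date").count().show()
```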

Serverless OpenSearch, Security Lake, DataZone and SimSpace Weaver 

Among the other announcements was a serverless version of OpenSearch, enabling developers to run petabyte-scale workloads without configuring, managing and scaling OpenSearch clusters. Also unveiled was Amazon Security Lake, which automatically centralises security data from cloud, on-premises and custom sources into a purpose-built data lake stored in the customer’s own account. Both are in preview now. 
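To make the serverless OpenSearch piece concrete, a minimal boto3 sketch of creating a collection is shown below; the collection name is an illustrative assumption, and the encryption and network policies a collection needs before it becomes active are omitted for brevity.

```python
# Hedged sketch: creating an OpenSearch Serverless collection so a search
# workload can run without provisioning or scaling clusters. The name is a
# placeholder; required encryption/network policies are not shown.
import boto3

aoss = boto3.client("opensearchserverless")

response = aoss.create_collection(
    name="logs-search",   # illustrative collection name
    type="SEARCH",        # collection type optimised for search workloads
)

# The response describes the pending collection (ID, ARN, status and so on).
print(response)
```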

The new data management service, Amazon DataZone, meanwhile, is designed to help enterprises catalogue, discover, share and govern their data across AWS, on-premises and third-party sources. 

AWS is using machine learning to help customers build the data catalogues and generate the metadata needed to make them searchable.  

“To unlock the full power, the full value of data, we need to make it easy for the right people and applications to find, access and share the right data when they need it – and to keep data safe and secure,” Selipsky says.  

“Good governance is the foundation that makes data accessible to the entire organisation, but we often hear from customers that it is difficult to strike the right balance between making data discoverable and maintaining control,” says Swami Sivasubramanian, AWS vice president of databases, analytics, and machine learning.  

“With Amazon DataZone, customers can use a single service that balances strong governance controls with streamlined access to make it easy to find, organise and collaborate with data. Amazon DataZone sets data free across the organisation, so every employee can help drive new insights to maximise its value.”

Also launched was SimSpace Weaver, a fully managed compute service to help customers build, operate and run large-scale spatial simulations without being constrained by hardware or having to manage infrastructure.  

Developers can use their own custom simulation engine or popular 3D engines such as Unity and Unreal. 

NAB and local startup deals 

There’s local news to be had as well, with NAB announcing it has signed a multi-million dollar long-term deal with AWS, expanding on previous work between the two companies. The deal includes the ‘acceleration’ of the migration of key critical workloads to AWS and will see NAB adopting innovations including Graviton processors to improve efficiency and sustainability in the cloud.  

The AWS Clean Energy Accelerator 3.0 was also launched, with AWS noting it is open to startups in Australia and New Zealand.  

John Kearney, AWS Australia and New Zealand head of startups, says the non-equity-dilutive accelerator program supports clean energy technology innovation with mentorship and co-innovation engagement from AWS, making it well suited to mature startups. 

Meanwhile, a new game-based training initiative, AWS Industry Quest, got the thumbs up from NAB chief technology officer Steve Day. 

The first iteration of Industry Quest is designed for the financial services sector, with NAB the first beta customer globally.  

Day says the program provides an ‘innovative and humanistic way of learning and strengthening cloud skills’. 

“Digital skills are the skills of the future, and, as one of the biggest employers of technology talent in the Southern Hemisphere, we need to ensure our people are trained incredibly well in this area so they and NAB can thrive.”
