Amazon Web Services (AWS) made it clear this week that for companies large and small, the biggest asset is data. It's a theme AWS says it will focus on heading into 2021.
The cloud giant's CEO Andy Jassy spent his re:Invent keynote running through announcement after announcement centred on what an organisation could achieve with data. While clean data is good for customers, AWS is betting that fluid data movement between storage, database, and analytics workflows[1] will be the secret sauce for prying more workloads away from legacy players.
"To have the ability to move data from data store to data store is transformational," Jassy said.
Jassy said that in the last few years alone, more than 350,000 databases have been migrated to AWS using its Database Migration Service or Schema Conversion Tool. But what has been missing, until now, is a way to handle the application code that's tied to the proprietary database being left behind.
"Especially as they've watched Microsoft get more punitive, more constrained, and more aggressive with their licensing," Jassy said.
AWS is open-sourcing Babelfish for PostgreSQL[2], an Apache-licensed project that acts as a translation layer for PostgreSQL: it understands both SQL Server's wire protocol and its T-SQL dialect, meaning PostgreSQL can now take commands from applications written for Microsoft SQL Server without developers having to change database schemas, libraries, or SQL statements.
Read more: Amazon just open sourced an easier path to PostgreSQL[3] (TechRepublic)
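To make the pitch concrete, here is a minimal sketch of what that compatibility looks like from an application's point of view. The endpoint, credentials, and orders table below are hypothetical; the point is that a stock SQL Server driver (pymssql, in this Python example) and unmodified T-SQL can run against a Babelfish-fronted PostgreSQL instance, because Babelfish listens on SQL Server's TDS port and translates the queries server-side.

```python
# A sketch of an unmodified SQL Server client app pointed at a
# Babelfish-enabled PostgreSQL endpoint instead of SQL Server itself.
import pymssql

# Hypothetical connection details -- in practice this would be the
# TDS endpoint of a Babelfish-enabled Aurora PostgreSQL cluster.
conn = pymssql.connect(
    server="my-babelfish-cluster.example.us-east-1.rds.amazonaws.com",
    port=1433,  # Babelfish's default TDS listener port
    user="app_user",
    password="app_password",
    database="orders_db",
)

cursor = conn.cursor()

# Note the T-SQL syntax (TOP, GETDATE(), the dbo schema) rather than
# PostgreSQL's LIMIT/now() -- Babelfish handles the translation.
cursor.execute(
    "SELECT TOP 3 order_id, total FROM dbo.orders WHERE created_at < GETDATE()"
)
for order_id, total in cursor.fetchall():
    print(order_id, total)

conn.close()
```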
AWS is betting the new translation capability will pull more of those workloads onto its cloud.
"Customers that we've spoken to privately about this, to say that they're excited would be one of the more glorious understatements I could make," he said. "People really want the freedom to move away from these proprietary databases."
Jassy said his company has a different way of thinking about its business.