Live analytics without vendor lock-in? It's more likely than you think, says Redis Labs

'AI serving platform' runs in database but isn't tied to specific cloud service

In February, Oracle slung out a data science platform that integrated real-time analytics with its databases. That's all well and good if developers are OK with the stack having a distinctly Big Red hue, but maybe they want choice.

This week, Redis Labs came up with something for users looking for help with the performance of real-time analytics – of the kind used for fraud detection or stopping IoT-monitored engineering going kaput – without necessarily locking them into a single database, cloud platform or application vendor.

Redis Labs, which backs the open-source in-memory Redis database, has built what it calls an "AI serving platform" in collaboration with AI specialist Tensorwerk.

RedisAI handles model deployment, inferencing and performance monitoring inside the database itself, bringing the analytics closer to the data and improving performance, according to Redis Labs.
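In practice that means the model and its input tensors live as Redis keys and inference is triggered by database commands. The following is a minimal sketch of the idea using the generic redis-py client to issue RedisAI commands (AI.MODELSET, AI.TENSORSET, AI.MODELRUN, AI.TENSORGET); the model file, key names, tensor shapes and values are illustrative only, and the exact argument order and keywords vary between RedisAI versions.

```python
# Sketch: run a (hypothetical) fraud-scoring model inside Redis via RedisAI.
import redis

r = redis.Redis(host="localhost", port=6379)

# Deploy a pre-trained TensorFlow graph into the database (illustrative file name).
with open("fraud_model.pb", "rb") as f:
    r.execute_command(
        "AI.MODELSET", "fraud:model", "TF", "CPU",
        "INPUTS", "transaction", "OUTPUTS", "score",
        f.read(),
    )

# Stage an input tensor next to the data it describes...
r.execute_command(
    "AI.TENSORSET", "fraud:input", "FLOAT", "1", "4",
    "VALUES", "120.0", "1.0", "0.0", "3.5",
)

# ...run inference without the data ever leaving the database...
r.execute_command(
    "AI.MODELRUN", "fraud:model",
    "INPUTS", "fraud:input", "OUTPUTS", "fraud:output",
)

# ...and read the prediction back out.
print(r.execute_command("AI.TENSORGET", "fraud:output", "VALUES"))
```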

Bryan Betts, principal analyst with Freeform Dynamics, told us the product was aimed at a class of AI apps where you need to constantly monitor and retrain the AI engine as it works.

"Normally you have both a compute server and a database at the back end, with training data moving to and fro between them," he said. "What Redis and Tensorwerk have done is to build the AI computation ability that you need to do the retraining right into the database. This should cut out a stack of latency – at least for those applications that fit its profile, which won't be all of them."

Betts said other databases might do the same, but developers would have to commit to specific AI technology. To accept that lock-in, they would need to be convinced the performance advantages outweigh the loss of the flexibility to choose the "best" AI engine and database separately.

IDC senior research analyst Jack Vernon told us the Redis approach was similar to that of Oracle's data science platform, where the models sit and run in the database.

"On Oracle's side, though, that seems to be tied to their cloud," he said. "That could be the real differentiating thing here: it seems like you can run Redis however you like. You're not going to be tied to a particular cloud infrastructure provider, unlike a lot of the other AI data science platforms out there."

SAP, too, offers real-time analytics on its in-memory HANA database, but users can expect to be wedded to its technologies, which include the Leonardo analytics platform.

Redis Labs said the AI serving platform would give developers the freedom to choose their own AI back end, including PyTorch and TensorFlow. It works in combination with RedisGears, a serverless programmable engine that supports transaction, batch, and event-driven operations as a single data service and integrates with application databases such as Oracle, MySQL, SQL Server, Snowflake or Cassandra.
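The event-driven piece is where RedisGears comes in: a Python function registered against a stream can call RedisAI each time new data arrives. Below is a rough sketch of what such a gear might look like, assuming a stream of engine telemetry and a model already stored under a key named engine:model; the key names, stream fields, threshold and the exact register() signature are assumptions for illustration and depend on the RedisGears version.

```python
# Sketch of a RedisGears function (submitted via RG.PYEXECUTE) that scores
# each new stream entry with a RedisAI model and raises an alert stream entry.
def score(record):
    # For a StreamReader, record['value'] typically holds the entry's fields.
    reading = float(record['value']['vibration'])

    # Stage the input tensor and run the model inside the database.
    execute('AI.TENSORSET', 'engine:input', 'FLOAT', '1', '1',
            'VALUES', str(reading))
    execute('AI.MODELRUN', 'engine:model',
            'INPUTS', 'engine:input', 'OUTPUTS', 'engine:score')

    # Flag anything the model thinks is about to go kaput.
    result = execute('AI.TENSORGET', 'engine:score', 'VALUES')
    if float(result[0]) > 0.9:
        execute('XADD', 'engine:alerts', '*', 'score', result[0])

# Fire the function for every new entry on the telemetry stream.
GB('StreamReader').foreach(score).register('engine:telemetry')
```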

Yiftach Shoolman, founder and CTO at Redis Labs, said that while researchers were working on improved chipsets to boost AI performance, hardware was not necessarily where the bottleneck lay.

"We found that in many cases, it takes longer to collect the data and process it before you feed it to your AI engine than the inferences itself takes. Even if you improve your inferencing engine by an order of magnitude, because there is a new chipset in the market, it doesn't really affect the end-to-end inferencing time."

Analyst firm Gartner sees interest in AI ops environments increasing over the next four years as organisations try to improve the production phase of the process. In the paper "Predicts 2020: Artificial Intelligence Core Technologies", it says: "Getting AI into production requires IT leaders to complement DataOps and ModelOps with infrastructures that enable end-users to embed trained models into streaming-data infrastructures to deliver continuous near-real-time predictions."

Vendors across the board are in an arms race to help users "industrialise" AI and machine learning – that is, taking it from a predictive model that tells you something really "cool" to something that is reliable, quick, cheap and easy to deploy. Google, AWS and Azure are all in the race, along with smaller vendors such as H2O.ai and established behemoths like IBM.

While big banks like Citi are already some way down the road, vendors are gearing up to support the rest of the pack. Users should question who they want to be wedded to, and what the alternatives are. ®
