
Dell and Starburst continue to push the boundaries of AI

  • Steven Foster, Cloud Alliances Manager, Starburst

  • Toni Adams, SVP Partner and Alliances, Starburst

With Dell Tech World 2025 in the rearview mirror, one thing is clear: Dell is redefining the future of enterprise AI, and Starburst is proud to be part of that ongoing journey.

Starburst and Dell share a vision: enabling organizations to access, prepare, and govern data at scale to power the next generation of analytics and AI. Most importantly, this access is universal: organizations can reach their data wherever it resides, whether on-premises, in the cloud, across clouds, or a combination of all three.

At the heart of value for customers is the Dell Data Lakehouse, part of the wider Dell AI Data Platform, a modern data architecture designed to meet the evolving needs of the AI era. With capabilities such as native vector search, built-in large language model (LLM) functions, Retrieval-Augmented Generation (RAG) support, and automated Apache Iceberg table management, Dell is giving enterprises the tools they need to turn data into action today.

Together, Starburst and Dell are helping customers build agentic AI strategies that work across complex, distributed data environments.

[Image: Dell Tech World 2025 presentations, highlighting the Dell AI Data Platform]

What was new at Dell Tech World 2025?

The Dell Data Lakehouse – with its Dell Data Analytics Engine, powered by Starburst – is evolving to meet the demands of enterprise AI. New capabilities help organizations simplify infrastructure, accelerate time to insight, and make AI more accessible across teams.

As part of this, Dell announced the following features: 

  • Native Vector Embedding Creation and Search: This feature lets teams create and query AI-ready datasets directly within the lakehouse, supporting centralized AI workflows for use cases like recommendation systems, semantic search, and customer behavior analysis.
  • Built-in LLM Functions: These functions democratize AI access, enabling business and analytics users to perform tasks like content extraction and sentiment analysis through SQL, without requiring coding skills.
  • Hybrid Search: Building on the vector search capability, a single SQL query combines semantic similarity with traditional keyword matching, optimizing for both contextual relevance and precision. It can power use cases such as enterprise search, chatbot retrieval, and metadata exploration (an illustrative query sketch follows this list).
  • Automated Iceberg Maintenance: This feature optimizes large-scale data workflows by automating maintenance tasks such as compaction and snapshot expiration. It streamlines performance at scale, reducing operational overhead and improving data readiness (a sketch of the equivalent manual steps also follows below).

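To make the hybrid search and LLM function capabilities more concrete, here is a minimal sketch of the kind of query an analyst might run against the Dell Data Analytics Engine, issued through the open-source Trino Python client. The connection settings, table and column names, and the embed, cosine_similarity, and ai_analyze_sentiment functions are illustrative assumptions, not the platform's documented API.

```python
# Illustrative sketch only: connection settings, table/column names, and
# function names are assumptions, not the documented Dell Data Lakehouse API.
import trino

# Connect to a Trino-based analytics endpoint (hypothetical host and catalog).
conn = trino.dbapi.connect(
    host="lakehouse.example.internal",
    port=8080,
    user="analyst",
    catalog="iceberg",
    schema="retail",
)
cur = conn.cursor()

# Hybrid search: keyword filtering plus a hypothetical vector-similarity
# ranking, with a hypothetical LLM function scoring sentiment -- all in
# a single SQL statement.
cur.execute("""
    SELECT
        review_id,
        review_text,
        ai_analyze_sentiment(review_text) AS sentiment      -- hypothetical LLM function
    FROM customer_reviews
    WHERE review_text LIKE '%battery%'                       -- keyword match
    ORDER BY cosine_similarity(                              -- hypothetical vector function
        embedding,
        embed('battery drains too quickly')                  -- hypothetical embedding function
    ) DESC
    LIMIT 20
""")

for row in cur.fetchall():
    print(row)
```

The point is the shape of the workflow: keyword filtering, vector similarity, and an LLM call expressed together in one SQL statement, with no separate pipeline or application code.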
 
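For context on what automated Iceberg maintenance takes off an operator's plate, the sketch below shows the equivalent manual housekeeping using the open-source Trino Iceberg connector's table procedures. The catalog, schema, table name, and thresholds are assumptions; the Dell Data Lakehouse is described as running this kind of work automatically.

```python
# Manual Iceberg maintenance sketch using open-source Trino Iceberg connector
# syntax; the Dell Data Lakehouse automates this housekeeping. Names and
# thresholds below are illustrative assumptions.
import trino

conn = trino.dbapi.connect(
    host="lakehouse.example.internal",  # hypothetical hostname
    port=8080,
    user="data_engineer",
    catalog="iceberg",
    schema="retail",
)
cur = conn.cursor()

# Compact small data files into larger ones to keep scans fast.
cur.execute(
    "ALTER TABLE customer_reviews "
    "EXECUTE optimize(file_size_threshold => '128MB')"
)
cur.fetchall()  # drain the result so the statement completes

# Expire old snapshots to reclaim storage and shrink table metadata.
cur.execute(
    "ALTER TABLE customer_reviews "
    "EXECUTE expire_snapshots(retention_threshold => '7d')"
)
cur.fetchall()
```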

Dell Data Lakehouse: The Data Engine of the Dell AI Data Platform

The Dell AI Data Platform is designed to manage complex data workflows at scale, allowing customers to place, process, and protect their data. Within this context, the Dell Data Lakehouse plays a foundational role in enabling AI-driven outcomes at enterprise scale. While powerful on its own, its real strength lies in deep integration with technologies like Dell PowerScale, creating an end-to-end ecosystem for AI-ready data.

Let’s look at these in more detail. 

Dell PowerScale

PowerScale offers a highly secure and flexible unified file and object platform, tailored for AI data preparation, training, and inferencing workloads. It is NVIDIA Cloud Partner certified and equipped with advanced data services that simplify data ingestion for RAG pipelines, while MetadataIQ enables rapid data discovery through advanced search capabilities.

NVIDIA RAPIDS and Spark 

Dell also announced support for Apache Spark with the NVIDIA RAPIDS Accelerator in the Dell AI Data Platform. This brings GPU power directly to lakehouse data pipelines, dramatically boosting performance for large-scale AI and analytics workflows, especially agentic and retrieval-augmented generation (RAG) use cases. It accelerates ETL and ML pipelines, helping to reduce adoption bottlenecks and lower infrastructure costs. Early benchmarks show up to 90% faster query performance and a 53% reduction in total cost of ownership, delivering scalable, cost-effective AI infrastructure for industries like finance, retail, and manufacturing.
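As a rough illustration of what the RAPIDS Accelerator integration involves, the sketch below enables the open-source plugin on a plain PySpark session. The package version, GPU resource amounts, and data paths are assumptions; the Dell AI Data Platform packages this integration rather than requiring manual setup.

```python
# Rough sketch of enabling the open-source RAPIDS Accelerator for Apache Spark
# on a PySpark session. Package version, resource amounts, and paths are
# illustrative assumptions; the Dell AI Data Platform ships this pre-integrated.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("gpu-accelerated-etl")
    # Pull in the RAPIDS Accelerator plugin (version is an assumption).
    .config("spark.jars.packages", "com.nvidia:rapids-4-spark_2.12:24.04.0")
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
    .config("spark.rapids.sql.enabled", "true")
    # Tell Spark how much GPU each executor and task may use.
    .config("spark.executor.resource.gpu.amount", "1")
    .config("spark.task.resource.gpu.amount", "0.25")
    .getOrCreate()
)

# A typical ETL step: read Parquet from the lakehouse, aggregate, write back.
orders = spark.read.parquet("s3a://lakehouse/retail/orders/")  # hypothetical path
daily_revenue = (
    orders.groupBy("order_date")
          .sum("order_total")
          .withColumnRenamed("sum(order_total)", "revenue")
)
daily_revenue.write.mode("overwrite").parquet("s3a://lakehouse/retail/daily_revenue/")
```

With the plugin enabled, supported SQL and DataFrame operations are transparently offloaded to GPUs, which is where the reported speedups for ETL and ML pipelines come from.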

 

Scaling AI Workloads with Advanced Infrastructure

To meet the demands of large-scale AI deployments, Dell has introduced significant enhancements to its storage solutions. These advancements are designed to provide the performance, scalability, and efficiency required for complex AI workloads.

Project Lightning Performance Testing

Project Lightning showcased advanced performance with the world’s fastest parallel file system, delivering up to 2x greater throughput than competing systems. Lightning is purpose-built to accelerate training and inferencing times for large-scale and complex AI deployments with tens of thousands of GPUs.

Dell ObjectScale Innovations

Dell has pre-announced S3 over RDMA support for ObjectScale, which significantly improves data transfer speeds and reduces latency for AI training and inference. The new ObjectScale systems, including the XF960 and X560, offer increased storage density and performance, enabling organizations to efficiently handle multi-petabyte-scale AI workloads.

Advancing ObjectScale with deeper integration of NVIDIA BlueField-3 DPUs

In addition, Dell has announced the planned integration of NVIDIA’s BlueField-3 DPUs and Spectrum-4 Ethernet switches into an ultra-dense, software-defined configuration of ObjectScale. These networking components provide up to 800 Gb/s connectivity, ensuring rapid data movement across the AI data pipeline and reducing bottlenecks in AI processing tasks.

These infrastructure advancements, combined with the capabilities of the Dell Data Lakehouse powered by Starburst, provide a robust foundation for enterprises to scale their AI initiatives effectively.

 

Powering the Future of Enterprise AI, Together

At Starburst, we believe the future of enterprise AI begins with easy, universal access to data, wherever it resides. 

One way that this future is already taking shape is with our partner Dell. The latest enhancements to the Dell Data Lakehouse reflect a shared vision: enabling organizations to move faster, scale smarter, and unlock the full value of their data for analytics, data applications, and AI.

Starburst is here to help you access your data where it lives, whether that means modernizing legacy systems, delivering data applications, or navigating strict compliance environments.

Ready to accelerate your AI strategy? Get started with Starburst and Dell.