Splice Machine Blog

9.14.18  /  How to Measure an HTAP Data Platform for AI Applications

AI is mission critical. Every company’s board is asking its executive team how it is using AI to digitally transform the business. AI has become a game changer. Data scientists are now a mainstay of the corporate analyst landscape, looking for that next actionable insight for the business...

9.12.18  /  Join Webinar: Accelerate ROI on Your Data Lake Investment

9 out of 10 large organizations have started on their data lake journey, but many are struggling to generate value from their huge investments. Data lake initiatives remain incomplete due to either 1) the lack of expertise and specialized skills around Hadoop or 2) the failure to scale the highly specialized...

9.12.18  /  Mastered Data Lakes: How You Can Simplify Data Pipelines and Run Mission-Critical Applications at Scale

In the fast-paced, competitive environment of real estate, data can be a company’s most valuable asset. This is certainly true for Ten-X, the largest online real estate marketplace in the U.S. Over the past eleven years, the company has had $53 billion in property sales and holds the Guinness World Record...

8.24.18  /  Watch Webinar: The Mastered Data Lake – How Ten-X Performs Modern MDM on Hadoop

Splice Machine's webinar with Ten-X on building a mastered data lake for your organization is now available to watch on demand. For Ten-X, the leading online real estate marketplace, integrating data at scale with MDM is critical. They needed to deliver business insight by combining unstructured clickstream...

8.22.18  /  Making Transactional SQL on HBase Scale

If you haven't tried building a scalable relational database that can run mixed transactional and analytical workloads, the Splice Machine team has a word of advice: it's hard. At HBaseCon 2018, Splice Machine co-founder and CEO Monte Zweben took the audience through the company's journey in scaling...
