Splice Machine Blog

Make the Elephant Fly at Strata + Hadoop World  /  2.18.15 

Join us at booth 1019 for a free t-shirt and a chance to win one of three $100 gift cards. Transactions were long thought to be out of scope for Hadoop, but that is changing thanks to Hadoop’s unique ecosystem. Now that Hadoop has become the de facto standard in the Big Data space, there’s...

Read More

Making Data Lakes Real-Time with Transactional Hadoop  /  2.12.15 

The phrase “data lake” has become a popular way to describe moving data into Hadoop to create a repository for large quantities of structured and unstructured data in native formats. Yet many are unfamiliar with the concept of an operational data lake, which enables the structured content of...

Read More

Splice Machine is now HDP Certified  /  1.27.15 

Splice Machine makes Hadoop transactional to power real-time applications. Splice Machine, which is built on top of the HDFS and Apache HBase components in the Hortonworks Data Platform (HDP), is delighted to announce that it has completed the required integration testing with HDP. Customers can now take full...

Read More

Scale Up vs. Scale Out: A Brief Guide to Scaling Oracle Databases  /  1.21.15 

Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. Splice Machine...

Read More

Webinar: Crawl, Walk, Run: How to Get Started with Hadoop  /  1.15.15 

Please register now to join Splice Machine on Tuesday, January 20th at 1pm PST/4pm EST for a briefing with William McKnight of the McKnight Consulting Group and Splice Machine’s own VP of Product Management and Marketing, Rich Reimer. During this webinar, you will learn how Hadoop is transitioning...

Read More