“I grew up in the era when it was assumed if you were buying a computer, you wanted to program it,” said Rice University Computer Science alumnus Doug Daniels (B.A. ’02).
Although he was comfortable tinkering with his family’s Tandy and subsequent computers, Daniels expected to study human behavior—through literature and anthropology—in college.
“I was drawn to Rice by its environment, weather, and people. It seemed like a good place to figure out why the world works the way it does. But I discovered I wasn’t having fun writing longer and longer papers about abstract ideas, and the end results were not as satisfying as the finite answers I’d arrived at in my high school math classes.”
After two years at Rice, Daniels talked to Corky Cartwright, a Rice CS professor, about changing directions. He enrolled in his first Rice CS class as a junior.
“COMP 201 was a very difficult introductory course taught with Scheme. It took me over 10 hours to complete my first project because I hadn’t programmed in years. But it didn’t feel like work. Time seemed to breeze by, especially compared to the hours spent writing papers. This work was challenging and interesting—plus, I could totally do this as a job later and that was magical.
“Changing majors mid-way through my junior year meant I had to quickly make up a lot of time. But I had met a lab partner who was also coming into CS late. K Young was switching to CS from Archi and Mechanical Engineering after getting excited about nano-tech and coding. We were taking all the same classes and did all our projects together.”
After graduation, Daniels and Young parted ways to start their careers. Daniels accepted a Houston job and Young headed to New York. But the friends would soon reunite, first as co-workers, then as business partners, and ultimately in leadership roles at Datadog, a fast-growing software-as-a-service (SaaS) company.
“We’ve worked together for over twenty years,” said Daniels. “That’s typical for Rice. What you accomplish at Rice and the people you meet there will be amplified the rest of your life.”
In New York, Young had begun work at an educational technology company. Daniels joined him there and immediately enjoyed the challenge of developing software for teachers to conduct reading and math assessments on Palm Pilots.
Daniels said, “The company was really successful, and their database covered about a fifth of the students in the United States. With that many records, we were no longer an education company; we were a data company.”
With so much data to handle, the company faced increasingly difficult data integration problems. It could painstakingly cobble the streams of data together in-house, or purchase expensive data integration tools that required huge amounts of computing power and brought additional operational challenges. Unsatisfied with their options, Daniels and Young began exploring new ways of processing massive amounts of data, including a new data-processing framework called Hadoop.
“Hadoop was released as an open source implementation of MapReduce around 2006, and Amazon Web Services opened up the cloud around the same time. With Hadoop, you could use commodity hardware to work on large amounts of data instead of investing in expensive, specialized computing hardware. With the cloud, all you needed was an API to spin up that hardware,” said Daniels.
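The MapReduce model Daniels describes splits a job into a map phase that emits key-value pairs, a shuffle that groups values by key, and a reduce phase that combines each group. A toy single-machine word count in Python can sketch the idea (this is an illustration of the programming model, not Hadoop itself, which distributes these phases across many machines):

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group values by key, as Hadoop does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's list of values into a final count."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["the"])  # 3
```

Because each map and reduce call touches only its own slice of the data, Hadoop can run thousands of them in parallel on commodity machines, which is the property that made cloud-hosted fleets so attractive.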
Daniels and Young quickly recognized an opportunity—not just for their own company, but for anyone with huge troves of data to be analyzed. “Together, Hadoop and the cloud gave everyone the ability to access a large fleet of servers and process huge data sets,” Daniels said. “But not everyone knew how to tap into that architecture. How could we make a toolkit for any engineer in any company to process their own data in the cloud?”
The two engineers struck out on their own to found a new company, Mortar Data, dedicated to putting these powerful new data technologies in the hands of any engineer who could benefit from them. “Most companies working with Hadoop were large organizations like Yahoo and Microsoft,” Daniels said. “They could dedicate engineers to code in Hadoop itself, but we wanted to bring it to everyone else.”
At Mortar, the co-founders built an open source toolkit and a hosted platform that made it much easier to run Hadoop jobs in the cloud. Daniels said their toolkit could run jobs on thousands of computers without requiring their customers to spend time learning and understanding the details of their network and infrastructure.
“The platform we built allowed software engineers to run Hadoop after installing our toolkit on their Windows and Mac laptops,” Daniels said. “We also created tutorials and documentation for people who didn’t want to learn Hadoop. They just needed to be able to enter the environment, write code, and run their jobs—all of which they could do without the support of a large engineering team.”
Mortar Data’s platform caught the attention of several SaaS companies with interesting data challenges of their own. Datadog, which provides a monitoring and analytics platform for large-scale applications and infrastructure, acquired Mortar Data in 2015 and onboarded the two co-founders as well as their employees.
As Datadog grew from a startup into an established tech company, Daniels and Young were both promoted to VP roles. “It’s been hyper growth,” said Daniels. “Datadog had 70 people when we joined them four years ago, and now it’s over 1,000. And our platform has been evolving just as rapidly. As more and more organizations move to the cloud, more people need to know how their services and infrastructure are performing.
“Datadog offers full-stack observability. Infrastructure metrics, logs and traces from the programs you are running, and everything going on in your production environment at high scale comes into your Datadog dashboard, so you can see it all displayed in one place. Thousands of companies now rely on our platform, but what’s really important is that our customers are other developers.”
Software developers may not be able to envision how their code will run at scale, or where their applications may have bottlenecks. Datadog collates this information, generates automated alerts, and highlights anomalies for their customers to review.
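The kind of automated anomaly highlighting described here can be illustrated with a minimal sketch: flag any metric reading that deviates sharply from its recent history. This is a toy z-score check with hypothetical latency data, not Datadog's actual detection algorithm:

```python
import statistics

def detect_anomalies(latencies_ms, window=5, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away
    from the mean of the preceding `window` observations."""
    alerts = []
    for i in range(window, len(latencies_ms)):
        history = latencies_ms[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9  # avoid divide-by-zero
        z_score = abs(latencies_ms[i] - mean) / stdev
        if z_score > threshold:
            alerts.append((i, latencies_ms[i]))
    return alerts

# Steady latency with one spike at index 8
series = [100, 102, 98, 101, 99, 100, 103, 97, 450, 101]
print(detect_anomalies(series))  # [(8, 450)]
```

A production system layers far more on top of this, such as seasonality-aware baselines and alert routing, but the core idea is the same: compare each new reading against recent behavior so developers are notified only when something genuinely unusual happens.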
Daniels said, “If you think about all the electronic and mechanical components in your car, you can imagine how hard it would be to manually inspect them all every time you go for a drive to pinpoint areas of concern. So cars have triggers that notify you when some component or system needs attention. Now imagine how hard it would be to monitor all those components across a huge fleet of cars that are constantly being modified, added to, and driven in unexpected ways. That’s the challenge of running a modern web application, and that’s what we do for our customers. We’re like a dashboard, a check-engine light, and a really good diagnostician for the Internet.”