Building a bicycle for data


David Kell

Jan. 17, 2020 • 4 min

Your brain is a big data processing engine.

It’s not the sexiest way to imagine ourselves, but it’s one way to think about intelligence. What I mean is:

  • You collect data. You have millions of sensors all over your body, including your eyes, ears and nose.
  • You build a model. Your brain synthesises the data to build a model of the world outside your head. It does this on gigabytes of data, in real time.
  • You make a decision. You decide on your next course of action by weighing this model of the world against your objectives and priorities.

The process of collecting data, analysing it and making decisions is a core part of what makes us human.

A million eyes

Looking around us today, the planet is covered with billions of internet-enabled devices, collecting data in real time. Websites, credit cards, GPS, social media, satellites, AVs, drones. It’s like we have a million eyes, ears and noses.

In theory, we should all be super knowledgeable about what is happening globally. In reality, most of our information comes via experts - news articles, opinion pieces, Wikipedia entries. (And unfortunately, you can always find an expert to support any opinion you want!)

The world is awash with data, yet our brains cannot turn it into a better model of the world on their own. How do we change this?

Where is the bicycle for data?

Steve Jobs famously described the computer as a “bicycle for the mind”. The idea is that just as a bicycle amplifies the power of our muscles, a computer amplifies the power of our brain.

Writing code is an early example of this kind of brain amplification through technology. People used to operate computers by punching machine instructions directly onto cards. With programming languages, they could express a program as something closer to a story. This was a radical notion in the 1950s!

Fast forward to 2020, and we generally think of coding as a highly technical skill. We’ve seen countless revolutions in what is now called “human computer interaction”, and our tools have evolved to become extensions of ourselves. We’re at a point where we can rent someone’s house by pressing our thumbs on a smartphone.

Except for data. We still need code to turn this data into a model of what is happening.
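To make that barrier concrete, here is a minimal sketch of the kind of code even a trivial question still demands today. The figures and region names are entirely made up for illustration; only the Python standard library is used.

```python
# A hand-written "group by and average" - the sort of code that stands
# between raw data and even the simplest model of what is happening.
from collections import defaultdict
from statistics import mean

# Raw data as it might arrive from an export: (region, revenue) rows.
# These values are hypothetical.
rows = [
    ("north", 120.0),
    ("south", 95.5),
    ("north", 130.0),
    ("south", 88.0),
]

# Group the revenue figures by region.
by_region = defaultdict(list)
for region, revenue in rows:
    by_region[region].append(revenue)

# Average revenue per region - a tiny "model of the world".
averages = {region: mean(values) for region, values in by_region.items()}
print(averages)  # {'north': 125.0, 'south': 91.75}
```

Nothing here is hard for a programmer, but every step - parsing, grouping, aggregating - is a wall for everyone else.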

What would happen if we could synthesise this data as effortlessly as we use a smartphone, without writing any code?

Building a no-code data tool

Every technology needs an interface. It’s the medium through which we communicate with the computer and get feedback - like the Word document, spreadsheet or message thread.

Existing data science vendors would like you to believe that the interface for data science is a workflow builder, a BI tool, a dashboard or SQL. But data scientists themselves have made a clear choice: the notebook.

Update from 24/10/21: While we got most of this right, we learned the hard way that the notebook is not the right interface for no-code data analytics. But everything else here remains true!

Why notebooks? Because data science is about telling a story - it’s the story your data tells you about the world. And notebooks enable you to tell this story, in an interactive document that combines code, visualisations and text.

And the best thing about notebooks is that they democratise analysis. Anyone can write a notebook. You don’t have to ask permission.

But awesome as they are, notebooks are still fundamentally about coding.

And that’s why we’ve built Gyana, the no-code notebook. It’s a bicycle for data science, designed to be as easy to use as a Word document.

Here’s a preview of what is possible with Gyana:

  • Interactive documents, to tell the story of your data.
  • The full power of SQL, R and Python, without coding.
  • Instant feedback, powered by the latest hardware in our cloud.
  • Seamlessly connect to your data from any source (files, databases, external apps, APIs).
  • Host your data for anyone to use.
  • Collaborate in real-time and share your notebooks.
  • In-built connections to external services, for NLP, ML and AI.
  • Scale up from GBs to PBs, without managing any infrastructure.
  • Turn your notebooks into templates for other people to use.
  • Automatic recommendations based on usage inside and outside your organisation.

With no-code data, we’re hoping it becomes normal for all of us to base our decisions on real data about the world.

Whether you are running a business, non-profit or just making sense of a political issue - you, and anyone you work with, can have effortless access to an accurate model of the thing you care about.