Question startup culture before accepting a data-to-AI role

an illustration of the data 'garbage in, garbage out' concept

AI shmAI. If you’ve been paying any attention, you’d know that for the vast majority of AI/ML projects, the real value comes from the data.

And if you’re considering a role with a startup that has grand AI plans, you’d better learn about the startup’s work culture – which includes its data culture.

To help you, this post discusses the questions from the Culture section of my Data-to-AI Health Check for Startups. Let’s jump right into it.

Q1: How often are people expected to work outside normal business hours (founders included)? Is unreasonable overtime compensated? “It’s a startup” isn’t a valid excuse for constant overwork. Most successful startups require a sustained effort over many years, and working at capacity reduces productivity over time. That said, high-effort spikes are inevitable – but unusual efforts should be recognised and compensated.

Q2: Do people go on leave regularly (founders included)? Like Q1, this probes cultural norms around overwork. Stay away from places where people never go on leave: it leads to burnout and collective stupidity. Knowledge workers need downtime to step back and come up with creative new ideas. Humans are not AIs.

Q3: How do employees view the leadership team and founders? A small startup won’t have significant quantitative data on employee views. Even at larger companies, employee surveys are often designed and administered in a way that masks problems. If you’re considering a role with a startup, ask to speak with current employees to learn about their views on the culture, the founders, and the company’s prospects. If the company is established, sites like Glassdoor and Blind can help you probe issues beyond the current employee base. In any case, remember that there’s an inherent selection bias in only questioning current staff, which is why it’s also worth seeking out the views of former employees and ex-founders.

Q4: How are wins celebrated? How are failures and mistakes analysed? The unfortunate reality is that many startup founders have little experience running or working at a startup. Therefore, they may not appreciate the need to celebrate wins or to take time to learn from failures. Rather than asking about general rituals, you could ask for examples: What did you do after the last big release? What did you learn from the latest outage? How will you mitigate similar outages?

Q5: How is excellent/poor individual performance evaluated and handled? If you’ve worked anywhere in any capacity, you’d know that the following is true: (1) underappreciated excellent employees may leave; and (2) poor performers may drag an entire team down. Before you join a growing startup, it pays to verify that the founders have put some thought into performance management – especially if excellence is one of your core values.

Q6: Does the company run data-informed experiments (like A/B tests)? If so, what is considered a successful experiment? For example, what happens if a well-run experiment produces results that contradict the CEO’s opinion? Finally, a question that directly addresses data culture! If you are considering a data role, a culture of intelligent experimentation is a positive sign that the startup is right for you. The correct definition of a successful experiment is “an experiment that taught us something new”. The common answer of “an experiment that confirmed our preconceived notions” (or in A/B testing terms: “an experiment where we shipped the test variation”) is absolutely wrong.
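
To make the distinction concrete, here’s a minimal Python sketch (with made-up conversion numbers) of analysing a two-variant experiment. The point is in the reporting: a precise null result is framed as something learned, not as a failure to ship the test variation.

```python
# Minimal sketch: analysing an A/B test where the outcome is "what we learned",
# not "did the variation win". All numbers are made up for illustration.
from math import erf, sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Assumed data: 120/2400 conversions on control, 131/2410 on the variation
diff, p_value = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=131, n_b=2410)
if p_value < 0.05:
    print(f"Learned: the variation shifts conversion by about {diff:+.2%}.")
else:
    print("Learned: any effect is too small to detect at this sample size, "
          "which is a valid and useful result even though nothing gets shipped.")
```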

Q7: Do leaders at the company explicitly seek truthful data, even when the truth may expose their mistakes? As with Q4, this may be best probed by asking leaders for examples of cases where they uncovered data that proved them wrong. Startups that harbour a culture of hiding from bad news are best avoided by excellent data people. In my experience, and based on countless stories from friends, avoiding bad news and truthful data becomes more common as companies grow. Great startup leaders care about the success of their business and know that hiding from the truth isn’t going to make it disappear.

Q8: How is uncertainty quantified and communicated? How does it affect decisions? Common sources of uncertainty include sampling biases and missing or wrong data. Marketers are especially notorious for ignoring uncertainty for the sake of memorability (“nine out of ten doctors agree…”). But ignoring uncertainty has long been a way of getting data-driven off a cliff. This is at the core of why I recommend the Calling Bullshit book and course to any aspiring data professional. Don’t work with startups that exhibit bullshit failure modes and ignore uncertainty – unless you have the mandate to shape the data culture for the better.
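
As a back-of-the-envelope illustration of the doctor example, here’s a short Python sketch that puts a Wilson score interval around a hypothetical survey where nine out of ten doctors agreed:

```python
# Quantifying the uncertainty behind a "nine out of ten doctors agree" claim,
# assuming a hypothetical survey of only ten doctors.
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half_width = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half_width, centre + half_width

low, high = wilson_interval(successes=9, n=10)
print(f"9/10 doctors agree, but the 95% interval is {low:.0%} to {high:.0%}.")
# With only ten doctors surveyed, the data is consistent with anything from
# roughly 60% to 98% agreement, which is a much murkier picture than "90%".
```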

Data-to-AI health beyond culture

This post is part of a series on my Data-to-AI Health Check for Startups, so check out the previous posts in the series if you’re new to it.

You can download a guide containing all the questions as a PDF. The next area I’ll cover is Processes & Project Management – aspects of delivery that are more formal than the somewhat-intangible Culture. Feedback is always welcome!
