Chris Wigley, QuantumBlack

DN met with QuantumBlack, one of the most pioneering AI companies in the UK, to explore the link between the diversity of data and performance. Enjoy…

Chris, it is great to be here, please could you give our readers an introduction to yourself?

Certainly, my name is Chris Wigley and I am the Chief Operating Officer here at QuantumBlack. I haven’t always worked in technology; in my undergraduate degree I did a mix of Computer Science and History, which is not a combination that often comes up.

I then did a mix of stuff at the BBC for a while, working on the transition from analogue to digital broadcasting, and iPlayer. After that, I spent some time as a diplomat for the UK and spent some time in Afghanistan and Pakistan.

I joined McKinsey in 2009 and kind of grew up here, doing digital technology strategy for a range of organisations: government, social sector, corporate, mostly consumer-facing. Then in 2014-15 I spent some time on internal projects looking at McKinsey’s own digital transformation. Off the back of that, we made a series of acquisitions and a series of organisational changes within McKinsey. One of the acquisitions was QuantumBlack, which at that time was a small company based in Shoreditch with about 35 people. I joined QuantumBlack’s leadership team following the transition, in December 2015, and we’ve now grown to over 300 people. We’re still in London but we’ve also got offices in Boston, Chicago, São Paulo, Delhi and Australia.

What was it about McKinsey that attracted you to the business, because it would appear that Analytics had not been such a prominent theme in your academic path?

It sounds stupid, but you could just apply. You didn’t need to know anyone; it was a very transparent process. It helped that one of my cousins had worked at McKinsey for a couple of years, and she said you will learn a ton! She said, “see if you like it, and if you don’t then it’s good training to go and do anything else”. So it seemed like a low-risk move. Having spent a lot of time at the BBC and in the Foreign Office, it offered a very interesting and diverse range of projects, and it was low risk in the sense of opening doors rather than closing them further down the line.

With so much interest in, and investment in, AI at the moment, particularly in London but across the UK as well, how does it feel for you? How do you think it is for the team to be part of such a pioneering business? Is there a pressure that comes with that?

We talk about it all the time and say we’re incredibly lucky to be working in a space which is so prominent and fast evolving at the moment. I think it helps that QuantumBlack has been around since 2009, so we’ve got 10 years of scar tissue; we have learned the hard way over and over again. We have been able to scale more sustainably having gone through those continuous loops of learning and experimentation, and tried to consolidate those into a way of working and an operating model that was scalable, that was relevant to the customers we work with, and hopefully a differentiator compared to what others are doing in the space.

Do you find that there has been an increase in understanding of Analytics from the clients that you work with given the rise in interest in the space?

It has been evolving rapidly. If I rewind just two or three years, most of the conversations we had were: “Wow, while this is a really interesting shiny toy, it would be great if you could come in and do a proof-of-concept, or work with some obscure corner of the business where we’re not going to break anything!”

And now a lot of large organisations have been doing these kinds of experiments, trying to get them to scale, and are struggling. At Davos earlier this year one of the CEOs said: “We seem to have more pilots in our organisation than our friends at Lufthansa; everything seems to be in pilot and we seem to be partnering with tons and tons of different organisations.”

What we work with clients on now is how to go from lab to live: how do we go from a tonne of small pilots that don’t scale? In some senses it is what we had to do ourselves; we had to find a replicable way of working. How are you going to bring that talent into your organisation? How do you build a technology platform that allows you to scale both the data assets and the AI applications that sit on top of them over time, and actually get back some value?

Not just technology for technology’s sake, but for performance. Whether that is individual performance, team performance or complex system performance, having a measurable difference at scale in the organisation.

That leads us nicely into the article that you published on Medium recently*, for those that haven’t read it, could you give us an overview of the article?

The article is about diversity, the different types of diversity, and how those link to performance in AI. QuantumBlack has always been anchored around bringing different perspectives and disciplines to bear on solving complex problems. We always used to draw a three-part Venn diagram on whiteboards that explained what we did: bringing together Data Science, Design and Engineering. Those three disciplines are still very much at the heart of what we do.

So we always had this belief that bringing these kinds of disciplines together to solve problems is very powerful and a different approach. The article looks at that approach to diversity and how that impacts problem-solving.

You mention in the article, that in the early days at QuantumBlack you had a 50-50 gender split, is that something you stopped and looked at, and planned for?

I think to some extent it is easier to do that when you are 20 people. It is easier to find 10 great female practitioners in those spaces than it is to find 150. To be honest, I think it happened organically: we had job openings, great people applied for them, we hired the best candidates. As we then moved into more of a scale-up mode we were very, very supply constrained, trying to hire at speed and scale and in multiple countries. So the 50% that we had in our technical roles has now slipped below 50%, but we are currently running multiple initiatives, with things like meet-ups and conferences around Women in Data Science, to bring that back up.

Can you hire with diversity in mind?

That is a great question. I think you can recruit while being very conscious of your diversity issues, and there are multiple companies who are building that into the way they work. When people come and spend time here, it is great: this is a very heavily technical company but there’s no ‘Bro-Culture’. Women are very much part of the fabric of this company and are in prominent roles. We work hard on things like balance and travel, which are important topics in the tech space.

Do you think as an industry we’re doing enough to encourage more females into STEM subjects and subsequently careers within tech?

I think we do a lot more than we used to, thanks to initiatives such as ‘Code First Girls’ and others, which are fantastic at doing that.

And we have come a long way. I have a nine-year-old daughter who spent two weeks this summer at a coding camp, and there was no sense of “oh look, she is a girl and she is doing coding”. It didn’t even come up as a topic, and the group of kids at the coding camp was very diverse. We see that diversity continue through A levels and core subjects like maths, which are essential for people seriously looking to get into AI.

There is still work to be done around encouraging girls and women to apply for those subjects at university, and I think some campuses could still be doing more on that in Computer Science and other departments.

Then I think a lot of companies in our peer group are actually bending over backwards trying to hire more women, with very meaningful efforts underway, but I think there’s a gap in the middle: something happens somewhere between GCSEs and A Levels, and people entering the workplace.

But I also believe it is going to take time to change. If you want more women coming out of PhD programmes in Computer Science and Maths, they started those programmes four years before, and Masters programmes two years before that, so it is going to take time for this new pipeline to pull through.

People diversity is not just limited to gender but also age. You mentioned that you have great age diversity here at QuantumBlack, do you see almost a voluntary mentoring system within the team, and does that help increase performance?

Absolutely. Age diversity is a really big thing for us. Unlike many small scaling tech companies, our CEO had over 30 years of very senior business experience before joining QuantumBlack, and the experience he brings of scaling enterprises, leading people and dealing with senior leaders, all that scar tissue, is invaluable for us. It gives us a real, stable sense of leadership and of mission, which is vital.

Equally, on the technical side, many of our client-facing and technical leaders are older, but I think the concept of reverse mentorship is also super important. I was last coding seriously probably 15 years ago, and I’ve just recently persuaded a new employee to reverse mentor me and get me coding again, in Python rather than the PHP or JavaScript I was writing 15 years ago.

I think there is this sense of a flywheel where we can all learn from each other. People with less experience can learn from those with more experience. But those with more experience can also learn from those with less: we get the fresh perspective of people who have grown up with a modern tech stack.

We hear so often that Data Science is a mindset, a curiosity. You mention in the article the importance of Data Diversity, is that curiosity something that you can teach? In academia for example?

I think so. The mindset is part of it, though it’s almost an attitude, or a series of habits, as well. The ‘Black’ part of QuantumBlack was always that we would be the Black Ops of Data and Analytics: we could parachute in and find the data wherever it was, or create data if it didn’t exist, then build models that work on top of it. I think that sense of being slightly hacky is still really valuable to us.

In the early days of QuantumBlack, when we were coming out of Formula 1, Jacomo, who is our Chief Scientist, was the strategist for one of the F1 teams, and they were using machine learning to try to win more races. They were trying to understand better how the other drivers were driving, in order to make better decisions during the race. They thought about it and realised that there is now a camera in every car. So if you go to the broadcast stream you get 20 different channels, one for each car, and you can actually analyse that audio and understand from it how the other drivers are driving. So if you create that data, you can feed it into the decision-making and better understand how the other drivers are driving.

In a different context, we were working with a pharma company on a clinical trial project around patient safety, which is one of the stories I touch on in the article. We had exhausted all of the normal data sets and the models weren’t performing, so it was really back to that whiteboard moment of: what other data is around?

It turns out that whenever anyone visits a clinical trial site, a report has to be written. What does the report look like? Well, it’s a PDF, on people’s laptops. OK, do laptops get backed up to the cloud? It turns out they do. Is there a text string in all of them that would identify them as a clinical trial report? Yes. Well then, we can locate all of those files, extract all of the text, geo-stamp it, time-stamp it, and suddenly we have a new data set that is incredibly rich and gives a new perspective for that model.
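That chain of steps (locate files by an identifying string, extract the text, attach a time-stamp) can be sketched in a few lines of Python. This is only an illustrative sketch: the marker string and the `.txt` extension are assumptions, and since the real reports were PDFs, a text-extraction step would come first in practice.

```python
from pathlib import Path

# Hypothetical marker string; the real identifying text is not given in the interview.
MARKER = "Clinical Trial Site Visit Report"

def collect_trial_reports(root, marker=MARKER):
    """Scan a backed-up file tree for documents containing the marker string,
    returning each match with its path, a file-metadata timestamp and the text."""
    records = []
    for path in Path(root).rglob("*.txt"):  # assumes text already extracted from the PDFs
        text = path.read_text(errors="ignore")
        if marker in text:
            records.append({
                "path": str(path),
                "timestamp": path.stat().st_mtime,  # proxy time-stamp from file metadata
                "text": text,
            })
    return records
```

Geo-stamping would need a further source (for example, a lookup from the laptop owner to a trial site), which is why this stays a sketch rather than a pipeline.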

I would assume that in your line of work there is a heavy reliance upon your clients having that data readily available and ready for production?

In some ways, but I would go so far as to say that no client in the world has perfect data; everything is a mess. And it is terrifying how quickly systems can become legacy. Even digital companies that were formed in 2004, for example, are already complaining about having legacy systems; it isn’t just banking or retail systems from the 1970s. So the data is always messy and it’s always a case of asking: “How do we have an agile process? What can we do now with the data as it is? How can we enrich this with external data?”

Whether that is something as simple as weather or consumer data, or other shipping, economic, political, market or financial data, how can we bring that to the internal data? We worked with an industrial company in India who were really interested in understanding how they could use AI to improve the performance of their massive infrastructure site. We said we can do this for you in 18 months, but you need to instrument yourselves up now: RFID tags in the helmets, better data collection, drones to get visual imagery of the sites. And 18 months later that work is now underway. So, as a general comment, we should never feel hampered by the data that is natively available; we should be creative about how we can find a way through.

In the article, you mention that you can often see bias in someone’s algorithm. Could you bring that to life for us?

Absolutely. I think this is a very interesting and fluid space at the moment, so first you need to differentiate between bias and fairness. Bias in a dataset or a model means it is unrepresentative of the population that it is trying to model.

Andrew Ng’s book, ‘Machine Learning Yearning’, which is being released chapter by chapter at the moment, uses a kitten recognition app as the anchor to explain the concepts he is talking about.

He says that if you train the app on images of kittens from the internet, they may look quite different from the images you take of your kitten on your phone using the app, so the model will not perform. That is an example of a bias issue.

Fairness issues can be related to bias, but I think they are distinct. A classic example here is police forces using predictive algorithms to suggest who they should stop and search. Because historically some communities have had worse outcomes, members of those communities are more likely to be flagged by a naïve algorithm. That is why you need to be careful: even though a model can be performing in line with a dataset that accurately reflects historical truth, it can at the same time be unfair and inappropriate.

So it is important to address both of them separately. First, on the bias point, we can look at it mathematically: are we sampling data in the right way? Are the cohorts big enough? Is the population we are talking about big enough? You can use multiple random resampling to get more cohorts out of smaller datasets, and so on. On fairness, it’s often a question of stepping back, and also of involving diverse points of view.
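“Multiple random resampling” here is essentially bootstrapping: drawing repeated resamples, with replacement, from a small dataset to see how stable a statistic is across many synthetic cohorts. A minimal sketch (the dataset and the choice of statistic, the mean, are invented for illustration):

```python
import random

def bootstrap_cohorts(sample, n_cohorts=1000, seed=0):
    """Draw repeated random resamples (with replacement) from a small sample,
    producing many synthetic cohorts of the same size as the original."""
    rng = random.Random(seed)
    size = len(sample)
    return [[rng.choice(sample) for _ in range(size)] for _ in range(n_cohorts)]

# Example: how much does the mean of a small dataset vary across resampled cohorts?
data = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7]
cohort_means = [sum(c) / len(c) for c in bootstrap_cohorts(data)]
spread = max(cohort_means) - min(cohort_means)  # a wide spread signals an unstable estimate
```

The spread of the statistic across cohorts gives a feel for whether the original sample is big enough to support the conclusion being drawn from it.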

When we work in very heavily regulated industries like finance, pharmaceuticals, aerospace and so on, we are also very thoughtful, as we develop algorithms on datasets with our clients, about how we can build transparency into what we are doing so that we can explain it to the regulator. Often we will have interesting conversations around trade-offs between performance and transparency. There might be a neural net model which more accurately predicts an outcome such as: “If we are a bank, is someone more likely to default on their card if we give them a loan?” But we could use a different type of model, like a random forest, which can more easily explain the drivers that classified someone into group A or B. Even though it has a slightly lower level of performance, we might choose to put that model into production because it’s explainable in a way that the neural net is not.
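That trade-off can be framed as a simple selection rule: accept an explainable model whenever its accuracy comes within some tolerance of the best-performing candidate. A toy sketch; the model names, accuracies and the 2% tolerance are all invented for illustration:

```python
def choose_production_model(candidates, max_accuracy_drop=0.02):
    """Prefer an explainable model as long as its accuracy is within a
    tolerance of the best-performing candidate; otherwise take the best."""
    best = max(candidates, key=lambda m: m["accuracy"])
    explainable = [m for m in candidates if m["explainable"]]
    viable = [m for m in explainable
              if best["accuracy"] - m["accuracy"] <= max_accuracy_drop]
    return max(viable, key=lambda m: m["accuracy"]) if viable else best

# Hypothetical candidates: the random forest trails slightly but is explainable.
models = [
    {"name": "neural_net", "accuracy": 0.91, "explainable": False},
    {"name": "random_forest", "accuracy": 0.90, "explainable": True},
]
```

With these numbers the rule picks the random forest; widen the accuracy gap beyond the tolerance and it falls back to the neural net.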

That said, a lot of our R&D work at the moment is around how we can bring explainable AI techniques to the various kinds of neural net models we work with in a deep learning solution.

What would be your stand out example of diversity having a positive impact on performance?

I was at an offsite recently in India and we were talking about how we could use technology to bring affordable improvements to rural populations there. We were in a small village. We had the very old Founder of this company; we had his children, who are now running the company; we had strategy consultants, technologists, patients and rural economists, probably about 25 people in this village for two days. It was rather extraordinary and I think the outcomes will be incredibly exciting.

None of the individual elements of that group could ever have got to the same place that we got to collectively.

How has the landscape of AI changed in your time at QuantumBlack?

For me personally, it has been so exciting to be a part of, and I think the changes are mainly positive.

The industry has come a long way. If I think back from 1998-99 to now, the centrality of tech in society has grown enormously, and I think the benefits to humankind have been enormous. We now take for granted global connectivity, that a farmer in an emerging market will have a smartphone. These things would have been unthinkable 15 years ago, so I do think the benefits of that change have been huge, and it’s a really exciting time to be part of this industry.

What advice would you give someone who is looking to pursue a career within Computer Science?

If I think about the profiles that we get at QuantumBlack, they are very, very different. So my first point would be that there is no single right answer.

I think the thing that cuts across all of this is, and it sounds a bit trite, but I think is true, is “learn how to learn.”

One of the things I spent a lot of my time doing in my masters, which was in history, was looking at loads of Renaissance travel writers, and when I took on other roles later in life, people would say, “oh, he did history, that is so irrelevant.” And I would say: well, it involved looking at different conflicting views on a topic, trying to navigate through those, trying to make sense of them, and then trying to communicate your perspective on them, and actually those are skills that last a lifetime.

Secondly, I think you need to be able to communicate complex ideas clearly and simply, to synthesise complexity into a point of view. Having that as a muscle is extremely useful. For people who have gone deeper down the Maths route, and who are hearing “Data Scientists will be obsolete in 10 years because machine learning algorithms will be running the software, so we won’t need any Data Scientists anymore”, I’m not convinced that someone with a deeply mathematical background, who is used to using mathematical concepts to solve problems, is ever going to find that an irrelevant skillset. Equally, and in a very different skillset, if I think about our user experience and interface designers, they get very deeply into the human experience and some even have Anthropology backgrounds.

That creativity and empathy with other humans is only going to become more and more relevant. So I don’t think it’s all about learning to code and doing computer science. I do think that computational thinking is an extraordinary life skill, and it’s helpful to learn to code for that reason, but I don’t think it is useful to learn to code just so that people can write Python per se. It is about embedding a way of thinking, and the ability to keep learning throughout life.

Where can we find you or the QuantumBlack team over the rest of the year?

We have an event at the Royal Institution on the 10th of October, with Diana Biggs, Alice Breeden and Gina Neff, on Diversity of Perspective. It would be great to see you all there!

*Article referenced in Medium: