
Big data ethics is a board-level issue

Big data analytics raises privacy and ethical concerns. What is your organisation actually doing about them? Does your board understand the risks?
Written by Stilgherrian, Contributor

There was something both pleasing and irritating about many of the presentations at the APIdays Sydney conference held earlier this week. Amongst all the talk of opening up your organisation's data through an application programming interface (API) and subjecting it to big data analytics, most of the speakers I listened to acknowledged that it raised privacy and ethical issues. That's great, but those issues never seemed to be explored beyond that acknowledgement.

As the conference wore on, and my irritation began to outweigh my pleasure, I decided to challenge a speaker about this. It just so happens that I chose Simon Dorrat, manager of strategy, architecture, and innovation for Toyota Australia's information systems division -- but really, I could have picked anyone.

Dorrat had just outlined the opportunities that might flow from what he called the "rivers of data" pumped out by vehicle telematics. Faults could be detected earlier, meaning they could be fixed earlier, reducing maintenance costs. Fleets of vehicles could be better managed. Stolen vehicles could be tracked. Driving patterns could be analysed to improve fleet efficiency, and warn about dangerous drivers. And much more.

This data could even be used for targeted marketing, he said. If a driver regularly hauls a heavy boat, trailer, or caravan, for example, that would lead to heavier braking. That could be detected, and the driver offered an upgrade to a heavy-duty brake kit.
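At its simplest, detecting such a pattern is just counting hard-braking events against a threshold. Here's a minimal sketch of that idea — the thresholds and function names are my own invention for illustration, not anything Toyota has described:

```python
# Hypothetical sketch: flag drivers whose telematics show frequent
# heavy braking, the kind of signal that might suggest regular towing.
# All thresholds below are assumed values, not real telematics specs.

HEAVY_BRAKE_G = 0.5   # peak deceleration (in g) counted as "heavy" (assumed)
FLAG_RATIO = 0.05     # flag if more than 5% of braking events are heavy (assumed)

def flag_heavy_braker(braking_events):
    """braking_events: list of peak decelerations (in g) for one vehicle."""
    if not braking_events:
        return False
    heavy = sum(1 for g in braking_events if g >= HEAVY_BRAKE_G)
    return heavy / len(braking_events) > FLAG_RATIO

# A vehicle that regularly stops hard, as when hauling a heavy load:
print(flag_heavy_braker([0.2, 0.6, 0.3, 0.7, 0.55]))
```

The point is how little analysis is needed before a marketing decision — and a privacy question — falls out of the data.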

Dorrat's presentation included a slide listing the challenges it would face, and that included the need to "address ethical and privacy concerns". So at the end of his presentation, I raised my hand.

"What organisational structures and processes do you have in place to ensure that those issues are on the table at every stage, if any?" I asked.

Dorrat mentioned Toyota's corporate compliance committee, which looks at regulatory aspects -- including privacy and consumer law. Given the many regulatory issues that a carmaker needs to deal with, I daresay that committee's processes are quite mature. But beyond that, it sounds like things are ... well, less mature.

"From the point of view of understanding how customers will react to those [data] decisions, that's something that's new for us, because we're really just starting to do this. So the processes there, we'll need to develop [them] in consultation with customers to work out the best approach," he said.

"I think it's going to be a slowly-slowly sort of thing. Like many in the industry, we're going to have to learn as we go. And if we offer value, then the willingness to share information will change accordingly."

In other words, they've got nothing -- at least for now.

But Toyota already has 75,000 vehicles on the road in Australia that are collecting data. Toyota itself may not be doing anything with that data yet, but drivers are already sharing it through third-party apps on their smartphones.

Given that Toyota set up the framework for collecting this data in the first place, and given that it takes years to design new systems and incorporate them into production vehicles, I'd contend that the company is a bit behind the pace.

Shouldn't you consider the ethical and social impacts of a new product before putting it on the market, rather than treating them as marketing and customer service issues after the fact?

Now, let me stress again that I'm not having a go at Toyota. It's just my random example for today. Most large manufacturing organisations are probably in much the same position. They've never been in the data business before. It's all brand new to them.

And that's the problem. This is yet another example of technology being deployed before its social implications have even been considered, let alone understood. That seems to be the inevitable way of us humans. Fine. But that also means it's inevitable that we'll experience some ethical, social, and privacy oopsies before we get it right.

That means at least some players will have a nasty mess to clean up. Assessing the risk of causing such a mess, and deciding whether to take on that risk, is, as always, something for the board to consider.

If a big, established firm like Toyota Australia is still -- and I mean no disrespect -- making it up as it goes along, what about some of the more feral, fast-growing newcomers? What about your own organisation?
