White Papers vs. the Real World of Machine Learning, Big Data, Blockchain, and Artificial Intelligence
Artificial Intelligence, or really machine learning, is all the hype now. Everybody is doing it. Everybody has it. Everybody needs it. Everybody wants it. It'll change the world. It'll disrupt. It'll make you rich. It'll save your hide. It's better than the discovery of fire and sandwiches.
Throw in some smart bots, big data, Spark, cloud, and cognitive computing and you hit the grand slam of them all. Throw up a couple of cool white papers around neural networks and linear regression, an article or two around TensorFlow or Theano or Torch or Caffe, and you're good to go, even though TensorFlow has become the de facto 'everyone is using it these days' tool. Or so you would think, until you start talking to people who are really using it.
Silicon Valley and a few other areas are indeed 'all in' when it comes to neural networks, AI, machine learning, big data, etc. But many of the white papers you read, or the webinars and meetups you attend, come from people who work for companies that don't actually make money. Or they work for Google or Facebook, which is almost like not working in the real world anymore. Or the talk sounds far more like a resume builder than a story about something they did that made an impact. I'm sure many of them do make huge impacts, but you rarely hear about it. I've attended some of these talks, listened, and read the papers, and many of the companies behind them are new. New as in the company has existed for a year or less and is already 'disrupting' and changing the world. Except many are not much different than Theranos. Maybe not at that level, but at the level where all the cool things they do haven't made them any money; it just got them more VC funding. Great if you can pull it off, but pointless to the rest of the world when it comes to use cases or actual usage of those algorithms.
Hadoop and even AWS have had that problem for years. Most of the corporate and government worlds want easy buttons. Nobody wants to spend a year figuring out how to actually use the right tools in AWS, or which open source tool is the latest and greatest in the Hadoop ecosystem, or how to get their staff to learn it while still working a '9-5' job. Many organizations want to be innovative, but they also follow slow processes, procedures, and policies. And while many of us, myself included, dislike some of these slow, DMV-like procedures and documentation requirements, the other reality is that none of these places can live in a world where writing off a billion-dollar moonshot is just a shrug of the shoulders.
I mean, imagine if anybody else came up with Google Glass and failed so miserably. IBM has posted something like 18 consecutive quarters of declining revenue, but they still make billions. Imagine if they came up with Google Glass and just shrugged it off. People would be killing them. People already are over Watson and how it's not making them money. Google gets away with many things. Their Pixel phone with the terrible commercials, for one. Microsoft was made fun of for its disastrous Surface and Windows Phone commercials and product placements. Google gets a pass, but their commercials are even worse, and kind of full of themselves.
But back to the original point: people who work for Google can get away with implementing and playing around with all these cool AI, machine learning, and robotics projects. Most other people in the corporate or 'real world' can't spend six months on deep learning algorithms and come back with: hey, Google+ was a huge failure, Hangouts is being phased out, our image recognition was racist and often wrong, our Nest devices were a recall nightmare, the Motorola purchase was a bankruptcy-level failure, but hey, moonshot, so what.
That's a lot of 'so whats' that nobody else can get away with. But it's also a lot of 'so whats' that the rest of us can learn from, utilize, and use to push forward some real, successful results for the rest of the corporate and government worlds. Instead we get a ton of white papers that go even further into moonshot theories. I've read a ton of white papers; some I don't understand half of, or any of it, really. But many of them are great in theory, great for some PhD, great for some moonshot Google project, and rather useless in the real world.
I mean, I've read papers that went into detail on how to implement everything in MATLAB. Really? And you want people to do that in the corporate world? It gets even more interesting when the entire white paper is great for comparing algorithms and models but never answers whether any of it worked for a business. It was pretty much a love-fest over which algorithm to implement rather than anything actually useful to a business entity.
We need far more useful use cases and fewer resume-building, look-how-smart-I-am white papers. There are some best practices and strategies to draw on, but forcing what worked for some POC in college isn't the same as doing it for a big corporation that has been plugging away for 40 years. They might not be doing everything right, but then again, given that many Silicon Valley startups don't really make any money either, the only thing those startups are doing better is conning people into handing over VC money, not marketing to people who would actually use what they've built.
Next time I will talk about neural networks and using AWS to implement some cool things that are actually useful for real businesses, not just the Googles and Stanfords of the world.