Irony Disclosure: Dr. Mani Sivasubramanian and I shared this discussion on Facebook – between pictures of food and cats.
“We are not only what we read. We are how we read.” – Maryanne Wolf
Declan Dunn: Is Facebook making us stupid too? How about AI?
Were “we” smarter before, compared to now – are these new habits a step back, or evolution?
Is this short-attention-span reading habit a way of dumbing things down, or is that just another Luddite overreaction to change?
Interesting to read how this journalist’s research led him away from reading longer books – moving away from longer pieces and into the staccato, in-and-out reading of the social media world…and this was well before Facebook got ginormous.
Great insight in the article: when Nietzsche learned to write on a typewriter, it changed his prose forever, because the machine changed how he communicated. Anyone who can type knows that feeling, where the words flow faster than you consciously think them.
Does this stop people from reading books, focusing on shorter bits of information in a limited time frame, including the author of this piece?
I don’t think anything really makes us stupid, yet what we are all in the middle of – this social, mobile, up-and-down, never-stopping flow of thoughts, photos, and ideas without periods – is shifting how we communicate, from people to businesses, and it is dynamic.
Also, learning how Google’s ultimate plan is to become AI – already on the way – was striking. I knew that, but never at the level shared in this article….Elon Musk is afraid of AI, while Larry and Sergey want it to take over, at least what they’ve got….
Dr. Mani Sivasubramanian Declan – my perspective on this shifts as I read Gautam Shroff’s The Intelligent Web (I’m halfway through). I’m sure you’ll enjoy this book, which is essentially about “big data”, but along the way it explores AI and machine learning, comparing and contrasting them with “human learning” and how biological/neurological systems acquire and process new data, turning it into information. I’m finding it fascinating so far.
Declan Dunn I’ll check that out – reminds me of epigenetics, we’re finding so many new ways to acquire data. Love this “flat earth” moment.
Dr. Mani Sivasubramanian The guiding thesis seems to suggest that we’re leveraging machine intelligence and a humongous, interconnected global repository of data in a way that has been hitherto impossible – and using that “searchable” database to avoid remembering and “learning” some things, focusing instead on a higher level of processing data to come up with more valuable INFORMATION.
A representative passage:
“Almost anything one can think of is likely to have some presence on the web, in some form at least, however sparse or detailed. In many ways we can think of these 50 billion web pages as representing, in some sense, the collective experiences of a significant fraction of mankind—a global memory of sorts… So much so, that we are gradually ceding the need to remember things in our own memories and instead relying on searching the global memory using web search.”
Dr. Mani Sivasubramanian Also, just yesterday, I recall a friend posting a note about a “secret room” where Elon, Larry and Sergey meet to discuss things! Guess there must have been a few “wars” in there about the future of AI.
Declan Dunn Maybe that’s where the fear came from….if search is the future of AI….something ironic there.
We search to find what we don’t know, AI helps us know more so we don’t have to search so much.
Dr. Mani Sivasubramanian The distinction made in Shroff’s book (the part I’ve read) between “silicon memory” and AI seems to suggest that real AI is still a bit far away – but could get “real” faster than we might expect. The competitive edge of human intelligence seems to come from the extensive parallel processing neuronal network, which has no parallel in machine computing. In fact, I was stunned when he says, in one place, that today, Google’s million+ servers and all connected computing on the Internet “might match” the parallel processing power… of ONE human brain!
Declan Dunn The whole guessing game of the singularity – when that moment will happen – fascinating.
Maybe we should stop trying to create machines when it’s our own technology within us we should be emulating.
It’d be good to really understand all the amazing things our DNA can do and is doing as well – when silicon memory meets organic memory, things would speed up, right?
Because organic is almost always more powerful than what we can recreate….see every pharma drug out there; the organic versions are so much more potent and healthier.
I love discussing AI and watching my own parallel processor go from silicon memory to pharma drugs to naturopathic medicine…and it all makes sense to me.
Dr. Mani Sivasubramanian That’s what I find most fascinating about Shroff’s book. It’s the first where a direct comparison is drawn between biological “learning” and how machines have gone about it – showing how DIFFERENT the two approaches are!
Or maybe there are core similarities we’re not yet aware of – but in essence, it appears as if “machine learning” is NOT just a speeded up “biological learning” process at all, but something quite distinct. Should it start to attempt mimicking “natural” processes to become more efficient? Can it, given our lack of (or limited) understanding of parallel processing? Will the hybrid really be a super-intelligence? Who can tell?
Dr. Mani Sivasubramanian Aha! I find this towards the end of the Atlantic article:
“When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.”
Declan Dunn I wonder if computers think of operating like parallel processors.
Dr. Mani Sivasubramanian Or if brains will “evolve” to become like computers!
Dr. Mani Sivasubramanian Also, from this Atlantic piece, I’m wondering if the issue isn’t really a perceived “opportunity cost” of investing too much of a precious, limited resource (one’s time) in one or a few pieces of content, rather than a habitual inability to concentrate or focus on something lengthy or deep. I mean, it’s taking a risk to deep-dive into any article or blog post – and if you’re concerned that wasting time on one might keep you from finding another that’s more fascinating, wouldn’t that drive you to skim rather than read? And if you’re sure something is worth getting deeper into, wouldn’t you effortlessly switch gears to go ‘slow and deep’? I believe that’s what I would do. You?
Declan Dunn Yes, though it takes more for me to go deep than it used to, in terms of selecting content.
I also see similarities in how we read. Speed reading is not deep, it’s fast, and I’ve speed-read Faulkner and still got a ton out of it. In fact, with that particular author, I found it hard to go deep; it just spun me in too many directions.
I don’t have trouble immersing myself, though my evaluation of whether it’s worth my time comes up a little quicker.
Take Human 3.0 – I was reading that book, but for me it kept repeating the same premise; it felt like I’d been there before. It just bogged me down. Maybe if I’d skipped the intro I already knew, I’d have liked it, but it disappointed me because I want the new to show up a bit earlier, not wait until almost the last chapter to make a point.
I think the narrative – the way we write – is becoming more transient and, like memory, shifting in time and perception. The idea that you have to build up a massive bulk of information before releasing the secret…that’s shifting as well.
Mix it up, and really, we’re learning the power of editing even more.
Dr. Mani Sivasubramanian Right. It’s a different communication STYLE. Kind of like newspaper reporting: go for the jugular right out of the box – or else you lose the reader!
Declan Dunn And don’t repeat what’s already so well known….tricky.
Are Google and Facebook indeed making us stupid, creating a new intelligence, or a little of both?
What do you think?