Mind Matters: Natural and Artificial Intelligence News and Analysis

Tag: Gary Smith


Nobel Prize Economist Tells The Guardian, AI Will Win

But when we hear why he thinks so, don’t be too sure

Nobel Prize-winning economist (2002) Daniel Kahneman, 87, gave an interview this month to The Guardian in which he observed that belief in science is not much different from belief in religion with respect to the risks of unproductive noise clouding our judgment. He’s been in the news lately as one of the authors of a new book, Noise: A Flaw in Human Judgment, which applies his ideas about human error and bias to organizations. He told The Guardian that he places faith “if there is any faith to be placed,” in organizations rather than individuals, for example. Curiously, he doesn’t seem to privilege science organizations: I was struck watching the American elections by just how often politicians of both…


Failed Prophecies of the Big “AI Takeover” Come at a Cost

Like IBM Watson in medicine, they don’t just fail; they take time, money, and energy from more promising digital innovations

Surveying the timeline of prophecies that AI will take over “soon” is entertaining. At Slate, business studies profs Jeffrey Funk and Gary Smith offer a whirlwind tour starting in the 1950s, with stops along the way at 1970 (“In from three to eight years we will have a machine with the general intelligence of an average human being”) and at 2014: In 2014, Ray Kurzweil predicted that by 2029, computers will have human-level intelligence and will have all of the intellectual and emotional capabilities of humans, including “the ability to tell a joke, to be funny, to be romantic, to be loving, to be sexy.” As we move closer to 2029, Kurzweil talks more about 2045. Jeffrey Funk and…


#8 in our AI Hype Countdown: AI Is Better Than Doctors!

Sick of paying for health care insurance? Guess what? AI is better! Or maybe, wait…

Merry Christmas! Our Walter Bradley Center director Robert J. Marks has been interviewing fellow computer nerds (our Brain Trust) Jonathan Bartlett and Eric Holloway about 12 overhyped AI concepts of the year. From AI Dirty Dozen 2020 Part II. Now here’s #8. Sick of paying for health care insurance? Guess what? AI is better! Or maybe, wait… https://episodes.castos.com/mindmatters/Mind-Matters-114-Jonathan-Bartlett-Eric-Holloway.mp3 “Is AI really better than physicians at diagnosis?” starts at 01:25 Here’s a partial transcript. Show Notes and Additional Resources follow, along with a link to the complete transcript. Robert J. Marks: We’re told AI is going to replace lawyers and doctors and accountants and all sorts of people. So, let’s look at a case of the physicians. This was a piece…


Smith and Cordes’ Phantom Pattern Problem Named a Top 2020 Book

Published by Oxford in 2020, it deals with the “patterns” Big Data throws up that aren’t really there

David Auerbach has picked The Phantom Pattern Problem (2020) by Gary Smith and Jay Cordes as one of the top books of 2020 in the science and tech category. Auerbach, who describes himself as “a writer and software engineer, trying to bridge the two realms,” is the author of BITWISE: A Life in Code (2018). He has an interesting way of choosing books to recommend: Those that resist the “increasingly desperate and defensive oversimplification” of popular culture: I hesitate to mention too many other books for fear of neglecting the others, but I will say that of the science and technology books, several deal with subjects that are currently inundated with popularizations. In my eye, those below are notably superior…


Yellow Fingers Do Not Cause Lung Cancer

Neurosurgeon Michael Egnor and computer engineer Bob Marks look at the ways Big Data can mislead us into mistaking incidental events for causes

It’s easy to explain what “information” is if we don’t think much about it. But what if we ask a student, what does your term paper weigh? How much energy does it consume? More or less matter and energy than, say, lightning striking a tree? Of course, the student will protest, “But that’s not the point! It’s my term paper.” Exactly. So information is very different from matter and energy. It means something. Realizing that information is different from matter and energy can help us understand issues like the difference between the causes of a problem (causation) and circumstances that may be associated with the problem but do not cause it (correlation). In last week’s podcast, “Robert J. Marks on…


AI Is Not Nearly Smart Enough to Morph Into the Terminator

Computer engineering prof Robert J. Marks offers some illustrations in an ITIF think tank interview

In a recent podcast, Walter Bradley Center director Robert J. Marks spoke with Robert D. Atkinson and Jackie Whisman at the prominent AI think tank, Information Technology and Innovation Foundation, about his recent book, The Case for Killer Robots—a plea for American military brass to see that AI is an inevitable part of modern defense strategies, to be managed rather than avoided. It may be downloaded free here. In this second part (here’s Part 1), the discussion (starts at 6:31) turned to what might happen if AI goes “rogue.” The three parties agreed that AI isn’t nearly smart enough to turn into the Terminator: Jackie Whisman: Well, opponents of so-called killer robots, of course argue that the technologies can’t be…


The Brain Is Not a Computer and Big Data Is Not a Big Answer

These claims are mere tales from the AI apocalypse, as George Gilder tells it, in Gaming AI

In Gaming AI, George Gilder sets out six assumptions generally shared by those who believe that, in a Singularity sometime soon, we will merge with our machines. Some of these assumptions seem incorrect and they are certainly all discussable. So let’s look at the first two: • “The Modeling Assumption: A computer can deterministically model a brain.” (p. 50) That would be quite difficult because brains don’t function like computers: As neuroscientist Yuri Danilov said last year, “Right now people are saying, each synaptic connection is a microprocessor. So if it’s a microprocessor, you have 10¹² neurons, each neuron has 10⁵ synapses, so you have… you can compute how many parallel processing units you have in the brain if…
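The arithmetic behind Danilov’s estimate is easy to check. A minimal back-of-envelope sketch, assuming his quoted figures (roughly 10¹² neurons, roughly 10⁵ synapses per neuron) and his framing of each synapse as a parallel processing unit:

```python
# Back-of-envelope count of "parallel processing units" in the brain,
# using the figures quoted above: ~10^12 neurons x ~10^5 synapses each.
# (Purely illustrative; whether a synapse is anything like a
# microprocessor is exactly the point under dispute.)
neurons = 10**12
synapses_per_neuron = 10**5
parallel_units = neurons * synapses_per_neuron
print(f"{parallel_units:.0e}")  # 1e+17
```

On that framing, the brain would have on the order of 10¹⁷ synapse-level units — the scale Danilov is gesturing at, and far beyond any deterministic model a computer could run.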


Why AI Geniuses Haven’t Created True Thinking Machines

The problems have been hinting at themselves all along

As we saw yesterday, artificial intelligence (AI) has enjoyed an unbroken string of successes against humans. But these are successes in games where the map is the territory. Therefore, everything is computable. That fact hints at the problem tech philosopher and futurist George Gilder raises in Gaming AI (free download here). Whether all human activities can be treated that way successfully is an entirely different question. As Gilder puts it, “AI is a system built on the foundations of computer logic, and when Silicon Valley’s AI theorists push the logic of their case to a ‘singularity,’ they defy the most crucial findings of twentieth-century mathematics and computer science.” Here is one of the crucial findings they defy (or ignore):…


Has COVID-19 Helped or Harmed Crypto and Blockchain?

Cryptocurrencies rebounded after an initial slump earlier this year

The recently aired discussion at COSM about the future of bitcoin and other privately minted cryptocurrencies took place last October, before COVID-19 was much thought of in the Western world. Catching up, the cryptos and blockchain had a rough ride earlier this year but they have stabilized recently. In February, as the pandemic sent markets scurrying, things were looking grim for the cryptos: During the last week, the spread of the coronavirus has been all over the news; the virus, which had remained well-contained in China, spread throughout South Korea, Iran, Italy, and is now reaching its fingers into other parts of Europe. The New York Times reported on Thursday that “the signs were everywhere…that the epidemic shaking much of…


Interview: New Book Outlines the Perils of Big (Meaningless) Data

Gary Smith, co-author with Jay Cordes of Phantom Patterns, shows why human wisdom and common sense are more important than ever now

Economist Gary Smith and statistician Jay Cordes have a new book out, The Phantom Pattern Problem: The mirage of big data, on why we should not trust Big Data over common sense. In their view, it’s a dangerous mix: Humans naturally assume that all patterns are significant. But AI cannot grasp the meaning of any pattern, significant or not. Thus, from massive number crunches, we may “learn” (if that’s the right word) that Stock prices can be predicted from Google searches for the word debt. Stock prices can be predicted from the number of Twitter tweets that use “calm” words. An unborn baby’s sex can be predicted by the amount of breakfast cereal the mother eats. Bitcoin prices can be…


From Nature: A New, Topflight Computer Science Journal

Starting in January 2021, it proposes to tackle a key problem in computer use in science: replication of findings

The Springer Nature Group is launching a new online-only journal, Nature Computational Science. It is described as a “dedicated home for computational science” and we are told: Recent advances in computer technology, be it in hardware or in software, have revolutionized the way researchers do science: problems that are too complex for human or analytical solutions are now easier to address; problems that would take years to solve can now be unraveled in days, hours, or even seconds. The use and development of advanced computing capabilities to analyse and solve scientific problems, also known as computational science, has undoubtedly played a key role in transformational scientific breakthroughs of our last century, making progress possible in many different disciplines. Elizabeth Hawkins, “A…


Bingecast: Robert J. Marks on the Limitations of Artificial Intelligence

Robert J. Marks talks with Larry L. Linenschmidt of the Hill Country Institute about the nature and limitations of artificial intelligence from a computer science perspective, including the misattribution of creativity and understanding to computers. Other Larry L. Linenschmidt podcasts from the Hill Country Institute are available at HillCountryInstitute.org. We appreciate the permission of the Hill Country Institute to rebroadcast this…


New Book Takes Aim at Phantom Patterns “Detected” by Algorithms

Human common sense is needed now more than ever, says economics professor Gary Smith

Pomona College economics professor Gary Smith, author with Jay Cordes of The Phantom Pattern Problem (Oxford, October 1, 2020), tackles an age-old glitch in human thinking: We tend to assume that if we find a pattern, it is meaningful. Add that to the weaknesses of current artificial intelligence and “Houston, we have a problem,” he warns: The scientific method tests theories with data. Data-mining computer algorithms dispense with theory and search through data for patterns, often aided and abetted by slicing, dicing, and otherwise mangling data to create patterns. Gary Smith, “Phantom patterns: The big data delusion” at IAI News (August 24, 2020) Many of the patterns so detected are obviously spurious, for example: A computer algorithm for evaluating job…
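Smith’s point about algorithms that “dispense with theory and search through data for patterns” is easy to demonstrate. A minimal sketch (not from the book; the figures are illustrative): mine enough unrelated random series and one of them will correlate strongly with whatever target you pick — a phantom pattern by construction.

```python
# Sketch of the phantom-pattern problem: search 1,000 series of pure
# noise for the one that best "predicts" a pure-noise target. The
# winning correlation is strong, and entirely meaningless.
import random

random.seed(1)
n_obs, n_series = 20, 1000
target = [random.gauss(0, 1) for _ in range(n_obs)]

def corr(x, y):
    # Pearson correlation coefficient, computed from scratch.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Mine the noise for the best "predictor" of the target.
best = max(
    (corr(target, [random.gauss(0, 1) for _ in range(n_obs)])
     for _ in range(n_series)),
    key=abs,
)
print(round(best, 2))  # typically |r| > 0.6 -- purely by chance
```

With no theory constraining the search, a “significant” pattern is guaranteed to turn up; that is precisely why data mined this way cannot be trusted at face value.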


Is Crypto Just a Flash in the Pan?

Or, to put it more bluntly, will blockchain ever grow up to be a real financial system? Forbes says yes, cautiously
Will blockchain and other non-government currencies ever grow up to be a real financial system? What about the weird Canadian crypto uproar in which only a dead man knows the code to release the missing millions? Read More ›

Lovelace: The Programmer Who Spooked Alan Turing

Ada Lovelace understood her mentor Charles Babbage’s plans for his new Analytical Engine and was better than he at explaining what it could do

Turing thought that computers could be got to think. Thus he had to address Lovelace’s objection from a century earlier, that they could not be creative.

Read More ›

Is Moore’s Law Over?

Rapid increase in computing power may become a thing of the past

If Moore’s Law fails, AI may settle in as a part of our lives like the automobile but it will not really be the Ruler of All except for those who choose that lifestyle. Even so, a belief that we will, for example, merge with computers by 2045 (the Singularity) is perhaps immune to the march of mere events. Entire arts and entertainment industries depend on the expression of such beliefs.

Read More ›

Which Career-Limiting Data Mistake Are YOU Most at Risk For?

Award-winning data science author Gary Smith says the odds depend on your relationship to the data

Dr. Smith thinks that the most dangerous error is putting data before theory. Many data-mining algorithms that are now being used to screen job applicants, price car insurance, approve loan applications, and determine prison sentences have significant errors and biases that are not due to programmer mistakes and biases, but to a misplaced belief in data-mining.

Read More ›

How To Fool Facial Recognition

Changing a couple of pixels here and there can stump a computer

Both computers and humans can be fooled by patterns that appear significant but really aren’t. But the bigger the computer, the more random patterns it can find in the vast swathes of data processed.

Read More ›
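The pixel-tampering trick can be illustrated with a toy matcher (purely hypothetical — real attacks target deep networks, but the failure mode is the same): when a classifier leans on a handful of pixels, changing just those pixels flips its answer.

```python
# Toy nearest-neighbour matcher on 3x3 "images" flattened to 9 pixels.
# Purely illustrative -- not a real face-recognition model.
def classify(img, references):
    # Return the label whose reference image is closest to the input
    # in squared pixel distance.
    return min(references, key=lambda label: sum(
        (a - b) ** 2 for a, b in zip(img, references[label])))

references = {
    "person_A": [9, 0, 0, 0, 0, 0, 0, 0, 0],
    "person_B": [0, 0, 0, 0, 0, 0, 0, 0, 9],
}

img = [9, 0, 0, 0, 0, 0, 0, 0, 0]
print(classify(img, references))        # person_A

tampered = img[:]
tampered[0], tampered[8] = 0, 9         # change just two of nine pixels
print(classify(tampered, references))   # person_B
```

Because the two reference faces differ in only two pixels, those two pixels carry all the decision weight — a crude analogue of how carefully chosen perturbations exploit whatever features a larger model happens to rely on.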

2019 AI Hype Countdown #4 Investment: AI Beats the Hot Stock Tip… Barely

At the end of the day, AI-based investing actually performed like a bad index fund

Artificial intelligence may do well summarizing data, but the new insights that will lead the economy forward cannot be gleaned that way. What we need is not old data but new truths.

Read More ›

Bingecast: Is Cheese Consumption Causing Deaths from Tangled Sheets?

Those dealing with data must always remember “If you torture data long enough, it will confess to anything.” The answers that computers give must themselves be questioned. Robert J. Marks and Gary Smith address artificial intelligence, spurious correlations, and data research on Mind Matters. Show Notes: 01:34 | Introduction to Gary Smith, the Fletcher Jones Professor of Economics at Pomona…