>> As of 2025, 9 years after the knowledge pump had been primed, there is still no sign that Cyc would ever achieve general intelligence. The long slow failure of Lenat’s project is a strong indictment against the symbolic-logical approach to AI.
Really? CYC, a strong indictment against SAT solving, though unrelated to it? A strong indictment against automated theorem proving, though unrelated to it? A strong indictment against planning and scheduling, though unrelated to it? A strong indictment against program verification and model checking, though unrelated to it? A strong indictment against Knowledge Representation and Reasoning, though unrelated to it?
Or are those "symbolic-logical approaches to AI"? Some of those fields use heuristic search, but then so does Neural Networks research. What's gradient optimisation? It's a search algorithm that follows the gradient of a function, on the strong assumption that following the gradient leads to the function's optimum, the searched-for datum. So: a search algorithm, with a heuristic cost function.
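To make the point concrete, here's a minimal sketch (my own illustrative example, not anything from CYC or the essay) of gradient descent written explicitly as a greedy local search, where the negative gradient plays the role of the heuristic choosing the next state:

```python
# Gradient descent viewed as greedy local search: at each step the
# negative gradient is the "heuristic" that picks which neighbouring
# point to move to. f(x) = (x - 3)**2 is an arbitrary test function.

def gradient_search(df, x, step=0.1, iters=100):
    """Greedy search that follows the gradient heuristic downhill."""
    for _ in range(iters):
        x = x - step * df(x)  # move to the neighbour the heuristic prefers
    return x

df = lambda x: 2 * (x - 3)        # derivative of (x - 3)**2
print(gradient_search(df, x=0.0))  # converges towards x = 3, the optimum
```

Strip away the calculus and what's left is the same loop structure as hill climbing: evaluate a heuristic, take the best-looking move, repeat.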
How about probabilistic AI? Is that "symbolic-logical AI"? I mean, the Bayesian calculus is a form of logic after all, and probabilities are symbolic right up to the moment they turn into statistics, when you have to give them values [1].
How about neural networks again? Everyone will tell you they are "sub-symbolic" but in truth their "continuous" weights are finite-precision floating point numbers, so they're basically symbols, with a very large vocabulary. And in any case Neural Nets, like everything else, are running on a computer, a symbol manipulation device based on Boolean algebra and logic gates. Is that symbolic, logical, or neither?
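To illustrate the "large vocabulary of symbols" point: every "continuous" float32 weight is really one of at most 2**32 bit patterns. This snippet (my own example, using Python's standard `struct` module) extracts the exact 32-bit "symbol" behind a weight:

```python
# A "continuous" neural-net weight is really one of finitely many
# discrete bit patterns. Here we recover the exact 32-bit symbol
# that IEEE 754 single precision uses to represent a value.
import struct

def float32_symbol(w):
    """Return the 32-bit pattern that actually stores the weight w."""
    (bits,) = struct.unpack(">I", struct.pack(">f", w))
    return format(bits, "032b")

print(float32_symbol(0.1))
# 0.1 has no exact binary representation; what's stored is just
# the nearest symbol out of a vocabulary of 2**32 bit patterns.
```

A vocabulary of four billion tokens is large, but it's still a finite, discrete alphabet being shuffled around by Boolean logic gates.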
See what happens when you make up new terminology as you go along and then pretend it's some agreed-upon definition that everyone shares? Yes, I know. My first language is not English either, but the author has clearly gone and, er, delved into old papers about CYC, so I don't see that they have any excuse to make all those unsubstantiated and unsupported assumptions about what "symbolic-logical AI" is.
And as for CYC and Lenat: it's easy to disparage a man's work when he's dead and can't defend it. It's easier still if you're just repeating the same talking points that have circulated on the grapevine for who knows how long: blah blah CYC failed, blah blah symbolic AI failed, blah. The fact of the matter is that CYC is closed source, very few people can use it, and nobody outside of Cycorp knows its true capabilities. A bit like "OpenAI"'s latest "reasoning" models: all we can do is speculate and pretend we know what we're talking about, when we clearly don't. I mean they. The author.
_________________
[1] Which is what you have to do if you want to use probabilities in the real world. And if you feed the Mogwai after midnight they turn into Gremlins.
First, apologies for writing "symbolic-logical AI". It should have been "symbolic-logical AGI".
Second, I agree that the failure of the symbolic-logical approach to AGI is not justified in-text. I believe that it will only be a small component of the first AGI system, as an API for the AGI to use tools like a calculator, a proof verifier, etc. A way to invoke "crystallized intelligence", as it were. Nevertheless, since the claim is not justified in-text, a justification would probably run to a short essay of its own, and it is not necessary for the essay itself, it has been removed.
I'm sorry you felt the need to edit your text, it was not my intention to cause you to do that.
They are specifically talking about "general intelligence" here, something which hasn't been achieved, so we can probably classify all attempts at it as failures at this point.
I don't think the author meant to write off the entire field of symbolic reasoning because Cyc didn't achieve general intelligence, but I think it's fair to say that Cyc's lack of progress towards it does make it less likely that purely symbolic reasoning will lead us to AGI. That's not to say it might not play a part in AGI, or that it doesn't have other uses.