palata 7 days ago

The calculator made it less important to be good at arithmetic. Many people just cannot add or subtract two numbers without one. And it feels like they lose intuition, somehow: if numbers don't "speak" to you at all, can you ever realize that 17 is roughly a third of 50? With a calculator, the only way you realize it is if you actually look for it. Whereas if you can count, it just appears to you.

It's similar with GPS navigation. When you read a map, you learn to localise yourself based on the landmarks you see. You tend to build an understanding of where you are, where you want to go, and how to get there. But if you just follow a navigation system telling you "turn right", "continue straight", "turn right", then again you lose intuition. I have seen people follow their navigation system around two blocks only to end up right next to where they started. The navigation was inefficient, and with some intuition they could have said "oh, actually it's right behind us; this route is bad".

Back to coding: if you have a deep understanding of your codebases and dependencies, you may find that you could extract part of one codebase into a library and reuse it in another. Or that instead of implementing some complex logic in your codebase, you could contribute a patch to a dependency that would make things much simpler (e.g. because the dependency already has this logic internally and you could just expose it instead of rewriting it). But that requires an understanding of those dependencies: do you even have access to their code in the first place (either because they are open source or because they belong to your company)?

Those AIs obviously help with writing code. But do they help you build an understanding of the codebase, to the point where you develop intuition that can be leveraged to improve the project? Not sure.

Is it necessary, though? I don't think so: the tendency is that software becomes more and more profitable by becoming worse and worse. AI may just help write that worse, more profitable code faster. If we can screw the consumers faster and get more money from them, that's a win, I guess.

nthingtohide 7 days ago

> Back to coding: if you have a deep understanding of your codebases and dependencies, you may end up finding that you could actually extract some part of one codebase into a library and reuse it in another codebase.

I understand the point you are making. But what makes you think refactoring won't be AI's forte? Maybe you could explicitly ask for it. Maybe you could ask it to minify the code while keeping it human-understandable, and that would achieve the refactoring objectives you have in mind.

palata 7 days ago

I don't get why you're being downvoted here.

I don't know that AI won't ever be able to do that, just like I don't know that AGI won't be a thing.

It just feels like it's harder to have the AI detect your dependencies, maybe browse the web for the sources (?), and offer to make a contribution upstream. Or would you envision downloading the sources of all the dependencies (transitive ones included) and telling the AI where to find them? And giving it access to all the private repositories of your company?

And then, upstreaming something is a bit "strategic", I would say: you have to be able to say "I think it makes sense to have this logic in the dependency instead of in my project". Not sure AIs can do that at all.

To me, it feels like it's at the same level of abstraction as something like "I will go with CMake because my coworkers are familiar with it", or "I will use C++ instead of Rust because the community in this field is bigger". Does an AI know that?

fragmede 7 days ago

With Google announcing that they'll let customers run Gemini in their own datacenters, the privacy issue goes away. I'd love it if there were an AI trained on my employer's proprietary code.

fallingknife 7 days ago

Perhaps it will, but right now I find it much better at generating code from scratch than refactoring.