fragmede 1 day ago

There's still skill involved in using an LLM for coding. In this case, o4-mini-high might do the trick, but the easier answer that works with other models is to include the high-level library documentation yourself as context, and the model will then use that API.
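
A minimal sketch of what "include the documentation as context" can look like in practice. The function and doc string below are hypothetical, purely illustrative of prepending docs to the prompt before sending it to any model:

```python
# Hypothetical sketch: prepend library docs to the task prompt so the
# model is steered toward the documented API instead of a guessed one.
def build_prompt(task: str, library_docs: str) -> str:
    """Combine high-level library documentation with the coding task."""
    return (
        "Use only the API described in the documentation below.\n\n"
        "--- LIBRARY DOCUMENTATION ---\n"
        f"{library_docs}\n\n"
        "--- TASK ---\n"
        f"{task}\n"
    )

# Illustrative docs snippet (not a real library):
docs = "frobnicate(x: int) -> int  # returns x doubled"
prompt = build_prompt("Write a helper that doubles a number.", docs)
print(prompt)
```

The resulting string is what you'd pass as the user (or system) message to whichever model you're using.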

1
th0ma5 22 hours ago

What, besides anecdote, makes you think a different model will be anything more than marginally better?