Copilot is often a brilliant autocomplete; that alone will save workers plenty of time if they learn to use it.
I know that as a programmer, I spend a large percentage of my time simply transcribing the correct syntax for whatever’s in my brain into the editor, and Copilot speeds that process up dramatically.
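For example, in routine code like the sketch below (a made-up Python snippet, not something Copilot actually produced for me), I already know exactly what every line should look like; having the body suggested from the signature and docstring just saves the typing:

```python
from dataclasses import dataclass


@dataclass
class User:
    name: str
    email: str


def parse_users(rows: list[dict[str, str]]) -> list[User]:
    """Turn raw CSV-style rows into User objects, skipping rows with no email."""
    # Nothing clever here -- the point is that it's pure transcription work.
    return [
        User(name=row["name"], email=row["email"])
        for row in rows
        if row.get("email")
    ]
```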
The problem is when the autocomplete just starts hallucinating things and you don’t catch it.
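e.g. something like this (a made-up illustration, not a real suggestion I got), where the call looks plausible at a glance but the function doesn’t exist:

```python
import json

raw = '{"name": "Ada", "email": "ada@example.com"}'

# Plausible-looking but wrong: the json module has no parse(),
# so accepting this would blow up with AttributeError at runtime.
# user = json.parse(raw)

# What you actually want:
user = json.loads(raw)
print(user["email"])
```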
If you blindly accept autocompletion suggestions then you deserve what you get. AIs aren’t gods.
OMG thanks for being one of like three people on earth to understand this
That’s on you then. Copilot even very explicitly notes that the AI can be wrong, right in the chat. If you just blindly accept anything you haven’t confirmed yourself, it’s not the tool’s fault.
AI bad tho!!!