Microsoft is breaking its open and extension-friendly ethos with VSCode in order to cripple GitHub Copilot competitors with restricted APIs.
I wouldn’t touch AI tooling with a ten-foot pole. Sooner or later, someone’s gonna get sued for copyright infringement because they plagiarized someone else’s code via Copilot, and I don’t wanna be that someone.
At least the way I use AI tooling, it would be very difficult to accidentally plagiarize code, outside of boilerplate I guess. Tools like GitHub Copilot are most useful when they use your existing code as “reference”: if you ask for a simple method, they’re mostly restructuring code you already wrote.
In other words, Copilot is basically just meant to be used as really smart copy-paste.
Plagiarizing boilerplate is still plagiarism.
If it only used my own code, I’d be less reluctant to use it, but I was under the impression that it’s trained on numerous projects, not just mine.
Also, if it only used my own code, would it be useful? I very rarely have to write the same code twice.
It might be cool to have actual pattern copy-pasting for those 10 main use cases; using an LLM just to repeat some boilerplate feels like such a waste.
One case where the LLM is really useful is generating basic comments, but to be fair, about half the time my comments explain why I’m doing something, and Copilot isn’t smart enough to understand that.
I already copy boilerplate code verbatim from docs or Stack Overflow. Also, how would that work in closed-source programs?
That code has predictable license terms. You have no idea what the license terms are for some random code an AI plagiarized for you, because you have no idea where it even came from.
I don’t know, but Stac Electronics somehow figured out that Microsoft plagiarized their code, so it seems best to assume that I will also be caught.