GitHub Copilot is weird
Welcome back! Some of you may have heard about the new GitHub Copilot extension available for VS Code. While it's an awesome step in the AI direction, it still has a lot of room for improvement. First off, one of the biggest issues raised with this new piece of software is licensing. GitHub Copilot's website does mention that the model was trained entirely on public code, but "public" does not necessarily mean unrestricted: much of that code may be under licenses with real obligations. If that ends up being the case, how would we know that the code we receive is legal for us to use?
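To make the concern concrete, here is a hypothetical sketch of what a Copilot-style suggestion could look like (this function is invented for illustration, not actual Copilot output). Notice that the completion arrives as bare code: there is no license header, no attribution, and no signal telling you whether this exact implementation already exists verbatim in some licensed repository.

```python
# Hypothetical Copilot-style completion for the prompt
# "def levenshtein(a, b):" -- nothing in the suggestion itself
# indicates where (or under what license) similar code may have
# appeared in the training data.

def levenshtein(a: str, b: str) -> int:
    """Return the edit distance between two strings."""
    if len(a) < len(b):
        a, b = b, a
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            current.append(min(
                previous[j] + 1,          # deletion
                current[j - 1] + 1,       # insertion
                previous[j - 1] + cost,   # substitution
            ))
        previous = current
    return previous[-1]

print(levenshtein("kitten", "sitting"))  # prints 3
```

A short, common algorithm like this is exactly the kind of snippet that could plausibly match training data character-for-character, which is the heart of the licensing worry.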
The image above is an example of what GitHub Copilot's output looks like. How would someone even realize whether the code is licensed or not? I guess in this case we will have to trust GitHub and their machine learning model. There was another slight issue found in GitHub Copilot as well, specifically with built-in referencing; let me explain. A Reddit user noticed that a specific user was credited in an automatically generated piece of code from GitHub Copilot: