The technology twitterverse is abuzz with GPT-3 experiments. Even having a high-level understanding of the model, I’ve been taken aback at some of the things people have been able to do with it. I’m not at all surprised that it can do a very good job of generating prose, but writing syntactically correct HTML and SQL that accomplishes a user’s natural-language intent? That…was unexpected.
The two posts below are the best overviews I’ve seen, and I want to include them here because if you haven’t familiarized yourself with this topic, I think it’s important that you do. This feels important. Even with the current (obviously imperfect) version, there will be real production use cases and plenty of fascinating, unexpected downstream results.
I highly recommend spending some time diving into this, and these two pieces are good starting points.