- Using AI in general will become a skill set, and some people will put in the "10000 hours" to master it while others will not.
- A lot of people probably think the process is that you just type in random prompts and keep hitting the "generate" button until, purely at random, you get a "good output" and you're done. They think this because they've only dabbled in the process, and that's how *beginners* do it, so they assume that's how everyone else does it. This is a Dunning-Kruger moment.
- My experience is with ChatGPT. You get better at using the AI by learning what types of prompts work and what don't, and by adding supplemental prompts that tweak the work in specific ways: getting it to produce the right type of output, changing words here and there, adding or removing stuff, and regenerating parts that are missing. So, over dozens of ChatGPT projects, I've gradually refined what I know I can use ChatGPT to create.
- Even then, that's not a "finished work". You need to take the snippets of text it outputs and adapt them into the medium they're intended for. This part requires sufficient domain knowledge of the medium you're actually creating for.
- For example, I use ChatGPT to write song lyrics (my weak point), tweak them in the AI until I'm happy with them, then adapt the song with actual chords, play it on guitar and sing it. For this process I need a large amount of pre-existing domain knowledge for the type of thing I want to create, in addition to the new skills of coaxing ChatGPT into usable output and then transforming that output into another medium.
- Has this stifled my personal creativity? I have several new songs just about ready to record because of this.
- Another example is getting ChatGPT to write simple Python computer games. ChatGPT can handle a lot of the grunt work, and you can get it to output a complete working program without having written a single line of code yourself. But you need sufficient "domain knowledge" of the type of game you're going to create, the ability to read and understand the Python code ChatGPT outputs so you can tell whether it does what you want and write meaningful prompts to change it in the correct ways, and the ability to test the code, detect bugs and then re-prompt ChatGPT in just the right way that the bug gets fixed (there's a small sketch of what that review loop looks like after this list). So that's a lot of pre-existing domain knowledge needed on top of the new skill set of learning how ChatGPT "thinks".
- So while I haven't done more than dabble in the image-AI stuff, I can totally see the truth in what some of those creators say when they claim they DON'T just put random prompts in and hope something nice comes out, and that they have in fact spent hundreds of hours working in the tools to build up the knowledge to create the high-quality stuff. There's also the fact that raw images aren't really usable in many contexts.
- A lot of the time, images are used in some type of composite work in another medium. So I can see AI image output being used in design work to speed up some stages, with the designer then reworking the output into a new form, similar to what I do with the lyrics and Python code that ChatGPT produces. So my actual long-term view is that people don't in fact become more "lazy" because the AI will do it for them; they just spend the "10000 hours" learning stuff *on top* of what AI can do.
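
To make the Python point concrete, here's a rough hypothetical sketch of the kind of simple game ChatGPT might hand you (not from one of my actual projects, just an example in the same spirit). The work I'm describing is reading code like this, running it, and noticing details like how a non-numeric guess still uses up an attempt, then deciding whether that's what you wanted and prompting for a change.

import random

def guessing_game(low=1, high=100, max_tries=7):
    # Pick the secret number once per game.
    secret = random.randint(low, high)
    for attempt in range(1, max_tries + 1):
        raw = input(f"Guess {attempt}/{max_tries} ({low}-{high}): ")
        try:
            guess = int(raw)
        except ValueError:
            # Non-numeric input is rejected, but an attempt is still
            # consumed; the kind of behaviour you only notice by
            # actually playing the game and reading the loop.
            print("That's not a number.")
            continue
        if guess == secret:
            print("You got it!")
            return True
        print("Too low." if guess < secret else "Too high.")
    print(f"Out of tries. The number was {secret}.")
    return False

if __name__ == "__main__":
    guessing_game()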