There is a saying in computing: "garbage in, garbage out", or GIGO. We have arrived at a point in computer science where the computer can perform just about any marketing task that was once done by a human. It is neat, super-efficient and, most importantly, cheap. Having a robot do the job of a human was once the stuff of science fiction, but in 2023 it is a reality, and it will become extremely lucrative.

Adobe recently announced generative artwork as a service: for a certain number of credits, you can have its systems create the visuals you describe in writing, much as you would submit a prompt to ChatGPT. In Adobe's case, the generative service eliminates the need for a photographer. This is where GIGO comes in: AI is only as good as the input it receives. Although AI can learn and improve, it still requires source material, and in photography that source material is a fundamental weakness. If there is a bias in it, it will show, and shine brightly for all to see. Early facial-recognition systems, for example, performed poorly on certain demographic groups that were excluded from, or underrepresented in, their training data.

AI marketing is helpful, but one must always remember that its data may not be representative enough to support informed decisions. To emphasise how data collection and its interpretation can introduce bias, consider July 15th, 2017, when a national survey reported overwhelming confidence. The data was 100% accurate, yet the outcome did not represent the market it was supposed to reflect; the data effectively lied, because by July 18th the confidence was recognised as biased information. AI makes the same kind of mistake from time to time, because it is instructed to fill in the gaps. When the gaps are in photographic training data, the results can include hands with six fingers or the arms of eyeglasses passing through an ear. The source was accurate, but the instructions or the data analysis were flawed, and as a result the generated photo was flawed and, in some cases, deeply offensive.
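The survey failure described above can be sketched in a few lines of Python. The group names and numbers here are entirely hypothetical; the point is only the mechanism: every individual measurement is accurate, but the sample does not represent the population it claims to describe, so the headline figure misleads.

```python
# Hypothetical illustration of sampling bias: accurate data, wrong conclusion.
# True approval rates for two equally sized groups (invented numbers).
population = {"group_a": 0.30, "group_b": 0.70}

# The survey only ever reaches group_b respondents -- a biased sample.
sample = ["group_b"] * 1000
measured = sum(population[g] for g in sample) / len(sample)

# What the survey claims to measure: approval across the whole population.
true_rate = sum(population.values()) / len(population)

print(f"measured approval: {measured:.0%}")   # accurate for the sample
print(f"true approval:     {true_rate:.0%}")  # what was actually asked about
```

Every data point collected is correct, yet the measured figure overstates true approval, because the collection process never saw half the population. This is the sense in which "100% accurate" data can still lie.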