Decoding AI Assignment 2 Video + Blog

Video: Decoding AI final video assessment 2.mp4

The three-minute video explainer our group made covered AI animation generators and how they gather their data for training. This was a very interesting topic to look into because it involved experimenting with different animation generators and comparing their outcomes. Doing this assignment gave me a better understanding of the topics I could explore for my own video explainer and set me up with the information and research to do so. I feel I now better understand AI animation generators and how they work, as well as how to write the right inputs to get the result I want. The generator I worked with the most was Gooey.AI, testing out different prompts and changing a few words around to see what the AI would come up with. This was interesting because sometimes it produced what I typed into the prompt, and other times it produced a completely unrelated image. I worked very well with my group, though I learned it is best to share contact details early so we can reach each other if anything comes up last minute and check on each other's work progress; I felt I worked with Will and Chloe very well. I am very happy with the outcome of our work; it was a great combined effort, with all of us sharing an interest in the topic. I am excited to explore this topic further individually, and also to see what Will and Chloe come up with.

Decoding AI Assignment 2 Blog #3

The third blog post should reflect on a reading and discussion topic from class from weeks 4 – 6 (you may choose one that you feel most drawn to). What were some insights gained from this reading? Draw connections between the reading, discussion and your own experiences/insights from the everyday.

A reading I found interesting was the journal article 'Automation, wellbeing and Digital Voice Assistants: Older people and Google devices'. This is something I deal with in my everyday life with family members, and it was interesting to see it discussed academically. The article examines the impact, both positive and negative, of digital voice assistants on older people: the way they affect their social interactions and overall quality of life. One of the main findings I was drawn to was the impact digital voice assistants have on the wellbeing of the people using them. The journal includes a study of a couple, Helen and Ken, who said their use of the tablet helped with managing chronic pain: 'The music sort of helps to relieve the pain to some extent. It's better than painkillers sometimes, it makes you sleepy' (Duque et al. 2021). A real-life connection I had to this was my grandfather before his passing; he was also in chronic pain and used his digital devices as a distraction, whether video calling family members or watching videos on Facebook. While the article does highlight some of the struggles and learning curves, it also highlights the positive effects DVAs have on the lives of older people.

 

Duque M, Pink S, Strengers Y, Martin R, Nicholls L (2021) 'Automation, wellbeing and Digital Voice Assistants: Older people and Google devices', Convergence, 27(5):1189-1206. DOI: 10.1177/13548565211038537

Decoding AI Assignment 2 Blog #2

The second post should reflect on your current research/work in progress for your individual video project. What are some potential ideas/topics you’d like to cover in your video explainer (e.g., if your video is about recommender algorithms, do you have a case study or example you want to include?) Make connections between your research and things you have encountered and observed in your own media engagement (e.g., have you read some research about social media algorithms that you can connect to your own social media use?)

For my video explainer, I think I am leaning towards AI animation over AI art. I have two main ideas in mind. The first is covering AI within animated film and TV: how it is used, the programs involved, how it helps the workflow of pre- and post-production, and what AI is being used for this. My second idea is more practical: during the research for assignment two, I discovered an AI animation generator where two keyframes are input and the generator creates the in-between frames, doing the rest of the animating. For my video explainer I was thinking of comparing how the generator performs against hand-drawn animation: I would create keyframes, put them into the generator, and compare its output to the animation I would draw myself, discovering whether it improves or worsens an animation workflow, and maybe even whether AI tools like these can teach somebody to become a better animator. Animation is such a wide topic to explore, and this video explainer could go in multiple directions. In my own media engagement, I interact with AI animation the most on Twitter/X, as that is where there has been a large rise in AI artists, and where my interest in this topic comes from, since I see so much discourse about it every day. People are creating full animated scenes by inputting scenes from animated films and claiming the result as their own work, which highlights the lack of creativity AI art truly has; this could also be brought up in my video explainer.
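To make the in-betweening idea concrete, here is a minimal sketch of the input/output shape involved, where a 'pose' is just a list of 2D points and the in-between frames are generated by simple linear interpolation. This is only an illustration: the real AI generators learn motion from training data rather than interpolating linearly, and the function name `inbetween` is my own invention.

```python
def inbetween(key_a, key_b, num_frames):
    """Return num_frames poses evenly spaced between keyframes key_a and key_b.

    Each keyframe is a list of (x, y) points; the output is a list of
    in-between poses, each the same shape as the keyframes.
    """
    frames = []
    for i in range(1, num_frames + 1):
        t = i / (num_frames + 1)  # fraction of the way from key_a to key_b
        frame = [(ax + t * (bx - ax), ay + t * (by - ay))
                 for (ax, ay), (bx, by) in zip(key_a, key_b)]
        frames.append(frame)
    return frames


# Two keyframes: a "pose" made of two points moving upward
pose_start = [(0.0, 0.0), (10.0, 0.0)]
pose_end = [(0.0, 10.0), (10.0, 10.0)]

for frame in inbetween(pose_start, pose_end, 3):
    print(frame)
```

Even this toy version shows why the workflow comparison is interesting: the animator supplies only the two keyframes, and everything in between is generated.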

Decoding AI Assignment 2 Blog #1

The first post should reflect on the ‘genre’ and form of video explainers. How would you define a video explainer (can we call it a ‘genre’, or does it fit neatly into a ‘genre’)? From the examples we’ve watched in class, what are some of the aesthetics, modes or conventions that stand out to you? Choose an example (from class or your own research) and critically discuss those elements that you think work well/what you would do differently? What elements might you like to include in your individual video explainer (based on your assignment one proposal).

I feel video explainers can be counted as their own genre due to the large number of videos and topics being made, and the multiple types of video explainer that exist. While researching, I found there are all kinds of video explainers: animated, whiteboard, live-action, infographic and so on. From the examples we watched in class, I found the aesthetics of the videos interesting and eye-catching, especially the Vox videos, as they stuck with me the most. A favourite of mine was Vox's 'How cruise ships got so big', shot and edited in a way that keeps the audience interested; having animations and visuals accompany the information is one of my favourite things about the video explainers I have discovered, and something I would love to explore in my own. I also discovered that the thumbnails of video explainers have to be very eye-catching, as the thumbnail is what the viewer will most likely see first. Vox thumbnails in particular always use some collage or mixed media, sometimes becoming more bizarre, as with the video 'How The Conjuring became the Marvel of Horror', whose thumbnail has Annabelle's head photoshopped onto a famous Marvel character, leaving the viewer amused or confused and leading them to click on the video. Something I would change in most of the explainers I watched is the pacing: sometimes it is so fast that it is difficult to keep up with the information being presented. For my own video explainer I want to avoid this; I want the timing to flow at a good speed so the information can be understood well by viewers, with interesting visuals to accompany the video's contents.

Video Explainer Proposal

Artificial Intelligence has been advancing faster than ever before, but within the creative industries, is AI doing more harm than good? My video explainer will explore the question 'How is Artificial Intelligence affecting the art and animation industry?'.

To begin breaking down this question, I need a deeper understanding of art and AI. 'Art and the science of generative AI' (Epstein et al. 2023) was a beneficial resource for understanding more about the world of art, the history of AI, legal issues, copyright, consent and more; it gave me a starting point for my research and an understanding of several topics I could investigate. I plan to research the usage of AI and how it affects the creatives within the industry. 'It is not the AI that the artist is unhappy with, it is generative AI – instead of supporting artists in their process, it uses pattern recognition from artworks to remix something new. Without human artists, AI would have nothing to be trained on' (O'Brien 2023). Generative AI is developing every day and becoming a big problem within the art world: it uses works without consent, leading to AI artists claiming generated art as their own when it is extremely similar to the work of other artists, and it cuts revenue from human artists as businesses start to use AI art for their needs. Animators are having similar trouble; with programs like Sora, Krikey and others, it is easier than ever for AI to create a moving image or animation. Surprisingly to artists, some animators view AI optimistically and invite its usage within their workflow. Wow-How's article 'Revolutionising Storytelling: The Impact of AI in 2D/3D Animation' takes this optimistic view, mentioning the positive uses of AI within production, such as faster decisions and the use of automation, but it does not fail to discuss AI's setbacks in emotion and lack of creativity (Bob 2024).

In my video explainer, I will explain what AI is and the different types used for art and video generation, possibly using examples made by myself compared with how an AI might make them. To keep audiences engaged, I plan to keep it visual and entertaining while making it as informative as I can about how AI affects the industry and the possible future consequences.

 

Epstein Z, Hertzmann A, Herman L, Mahari R, Frank MR, Groh M, Schroeder H, Smith A, Akten M, Fjeld J, Farid H, Leach N, Pentland A, Russakovsky O (2023) 'Art and the science of generative AI', Science, 380(6650):1-23. DOI: 10.1126/science.adh4451

O’Brien S (2023) ‘AI Art: Why Some Artists are Furious About AI-Produced Art’, IEEE Computer Society. https://www.computer.org/publications/tech-news/trends/artists-mad-at-ai

Bob (2024) ‘Revolutionising Storytelling: The Impact of AI in 2D/3D Animation’, Wow-How. https://wow-how.com/articles/impact-of-ai-in-animation

Reflective Blog Post #3

Reflect on the week three reading(s) and studio discussions around targeted ads and ‘dark’ advertising. What were some insights you gained? Was there an idea that was new or surprising, or something that struck you as particularly thought-provoking? Draw links between the readings/discussions to your own experience with targeted advertising (remember to make the distinction between ‘targeted’ ads and ‘dark’ ads). 

While I was familiar with targeted ads, dark ads were unknown to me; I had never heard of them before. Both of this week's readings were extremely useful for fully understanding what a dark ad is. Dave's article about dark advertising describes them as ads that 'disappear moments after they have been seen, and no one except the platforms know how, when, where or why the ads appear' (Dave 2022). This is very different from targeted ads, which are directed at a certain audience based on data about what they have searched recently; dark ads are hyper-targeted and personalised to the person they appear to. What I found interesting is how these ads are displayed and how they are untraceable to everyone except the person the ad is shown to, whereas with the targeted advertising I have experienced on Facebook, it is easy to find or see the same ad a few times, as it will show up every few scrolls between posts.

From our class, I was extremely surprised to find out that Facebook is one of the leading platforms where these dark ads are shown to audiences, and also to learn how harmful some of them can be. This week's second reading, 'Shedding light on "dark" ads' from Continuum, mentions some of the harmful dark ads shown on the platform, using the 2016 US election as its prime example: 'The strategy relied on the ability to use Facebook to serve so-called "dark ads" – non-public posts whose viewership the campaign controls – to targeted audiences', with the campaign claiming to have run 50,000-60,000 dark ads a day on the platform (Trott et al. 2021:761). In my own experience, I don't think I have ever knowingly seen any dark advertising on Facebook, but I have had multiple experiences with targeted advertising across my social media. Whenever I search for something I want to buy in a search engine, over the next hours or days I start to see it advertised on my social media. For example, weeks ago I was looking to buy a new bed, and for the next week or two almost every ad I got on Facebook was for beds or bed-related items. On TikTok, advertising and targeted ads work differently: the ads I get most are for things that come up on my For You page. If something makeup-related comes up and I watch the TikTok in full, makeup ads will start appearing a few scrolls later.

 

Dave A (2022) ‘How dark is ‘dark advertising’? We audited Facebook, Google and other platforms to find out’, The Conversation. https://theconversation.com/how-dark-is-dark-advertising-we-audited-facebook-google-and-other-platforms-to-find-out-189310.

Trott V, Li N, Fordyce R, Andrejevic M (2021) ‘Shedding light on ‘dark’ ads’, Continuum, 35(5):761-774, DOI: 10.1080/10304312.2021.1983258.

Reflective Blog Post #2

After being introduced to ARC Centre for ADM+S: first, reflect on one area or project you find particularly interesting and why. Second, reflect on a connection to automated decision-making and AI that you hadn’t previously considered (e.g., the link between AI/ADM and Mobilities). 

https://www.admscentre.org.au/adm-ecosystems-and-multispecies-relationships/

ADM Ecosystems and multi-species relationships

The ARC Centre for ADM+S project I found interesting was ADM Ecosystems and multi-species relationships. This project caught my eye as it focuses on the relationships between humans and their environments compared with animals and their ecosystems. I found it interesting that they are researching human inventions, from delivery drones for groceries, meals and other goods, to digital bioacoustics. Upon reading about digital bioacoustics, I was unsure exactly what it was and did some research on it myself: it uses small digital recorders to pick up animal sounds that cannot be heard by human ears, after which researchers gather the recordings and use AI to interpret them as animal communication (Bakker 2022). I found this extremely interesting, as I had never heard of it before, and I find it fascinating how they use automated decision-making tools to compare two very different ecosystems and environments. The Guardian article raised a point I found very compelling about digital bioacoustics: the possible risks of engaging with, listening to and understanding the conversations of different species. The ARC Centre, however, focuses its research on other aspects of this topic; from what we know of their findings so far, their objectives are to show how environments and ecosystems evolve under extremes of heat, drought, flood and fire, by creating an ADM+Ecosystem toolkit to plan for and resolve these issues.

I never would have expected AI and ADM to be connected to researching environments, ecosystems and animal sounds. Reading the ARC Centre's research on ecosystems and multi-species relationships made me realise that ADM and AI can be used for so much more than online systems; they can be used in real-world situations. This week's reading, 'How to wrench open the black box of algorithms that decide our fate' from ABC News, visually explained how decisions about us are being made for us. The black box can be used to reach all sorts of conclusions about us, from big choices such as visas and insurance claims to smaller decisions like the map routes, films and music we get recommended (Fell et al. 2022). Everyday choices get made for us without us even thinking about or realising it. It is something I had never personally considered, but now that these ideas and concepts have been brought to light, I find myself seeing them everywhere and recognising how much automated decision-making is a part of our everyday lives.

 

Bakker K (2022) 'Science is making it possible to "hear" nature. It does more talking than we knew', The Guardian. https://www.theguardian.com/commentisfree/2022/nov/30/science-hear-nature-digital-bioacoustics

Fell J, Spraggon B, Liddy M (2022) ‘How to wrench open the black box of algorithms that decide our fate’, ABC News. https://www.abc.net.au/news/2022-12-12/robodebt-algorithms-black-box-explainer/101215902

Reflective Blog Post #1

Reflect on your initial perceptions/understandings of AI and automated decision-making – before this studio, what came to mind when you heard these terms? Consider the ways you’ve encountered AI and automation in your everyday life? What do you hope to get out of this studio?

Before the first class of the semester, Artificial Intelligence was not something I actively thought about in my everyday life, though I knew it was there throughout the internet and my social media. The first things that came to mind when asked about AI were ChatGPT, Spotify and Google's search engine. I thought of ChatGPT because I had heard about it in previous classes and on social media. Spotify is an app I use daily, and I read an article not long ago about how it uses AI to recommend songs to its users, create and change playlists daily, and use previous listening data for its DJ feature. I also thought of Google's search engine because, not long ago, searching a question would return an AI-generated answer, and these answers were posted on social media because of how bizarre they were. Automated decision-making, however, was something I had never heard of and knew nothing about; after our first class I came away with a basic understanding of what it is and what it does, and I found it intriguing that it can make so many decisions for us in our everyday lives. Our first week of classes made me realise how much AI and ADM are used in everyday life, from streaming services such as Netflix, which, similar to Spotify, uses AI to recommend films and TV shows based on what you have watched and what it thinks you would like, to YouTube using AI to recommend the best videos for its viewers.

In this class, I am hoping to gain a better understanding of AI systems and how they work, but after learning about ADM this week, I also want to learn more about how AI and ADM can affect our everyday lives and decisions. With the usage of AI growing with no end in sight, it will be useful to understand what it is and how it is used to make big decisions. While I don't know much about it now, each week I hope to gain a better understanding of everything we are learning about.
