NETWORKED MEDIA: W3 – REFLECTION

WEEK 3 – THE NETWORK

Niederer, S 2018, Networked Images: Visual Methodologies for the Digital Age, Amsterdam University of Applied Sciences, Amsterdam (read pp. 1-20).

Lister, M et al 2009, New Media: A Critical Introduction, Routledge, New York (sections: Networks, Users and Economics, pp. 163-169; Wiki Worlds and Web 2.0, pp. 204-209; The Long Tail, pp. 197-200; User-generated Content: We Are All Users Now, pp. 221-232).

This week we began to look at the concept of “The Network”.

In this week’s first reading, Networked Images: Visual Methodologies for the Digital Age, author Niederer takes us through networked images, visual methodologies and the surge of online video culture.

Niederer opens the reading with the idea of Video Vortex – a network of visual artists, filmmakers and theorists interested in the surge of online visual culture following the rise of YouTube, the video-sharing platform launched in 2005. Video Vortex imagined a future online visual culture whose imaginaries still resonate today, and many more visual platforms have since been launched (e.g. Tumblr, Instagram, Snapchat, Pinterest), becoming bigger than anyone imagined. Researchers who study online culture have noticed that many of the current concepts surrounding online images and digital visual culture acknowledge a visual turn – also referred to as a pictorial turn.

Niederer states that many scholars still focus on the theorisation of a single image and on how the image becomes unstable through its digitisation. This is where Niederer introduces us to the idea of Visual Methodologies. She explains that the Visual Methodologies programme proposes a different entry point: online images become “networked” when users like, share, comment on or tag them, and also when platforms and engines format, filter, feed and recommend them to others. Images may also be networked across platforms through their circulation, when the same image is fed to, or otherwise resonates on, different platforms and websites. For research, this means the researcher has to consider ways of demarcating content that go beyond the single image and take into account the entire network of related content – the actors, platforms and websites that surround that content and its images.

Niederer also explains that the interface is the set of opportunities presented to the user by the software, and that each platform and engine handles images in distinct ways, thereby revealing platform-specific techniques. Both the networked-ness and the technicity of online images call for an approach attuned to the medium. Taking networked-ness and technicity of context as a methodological entry point, it becomes clear that images should not be studied as separate from their network.

Niederer references information designer Gabriele Colombo, who pointed out that much of the image analysis work done today starts with a folder of images along with information and metadata about their location, user engagement and other variables (a small sketch of such a starting point follows the list below). This point of departure means that current research questions cannot be answered through the study of a single image stripped from its context. For example, we may need to study how images circulate, are engaged with, appropriated, made into memes and changed over time. This kind of research opens up important questions about both the over- and under-representation of certain works of art in the writing and circulating of content. Images have not only made it to the foreground of digital culture; they have also made a mark on a wide range of research practices.

Niederer then explains Visual Methodologies further. She says that, captured by the term “Visual Methodologies”, such research encompasses practices that include both the study of images – their interpretation and meaning-making – and the use of images for research. Niederer outlines two strands of research:

  • The first strand of research looks at four sites where the meanings of images are made: the site of production (how was the object made?), the site of the image or object itself (what does the image look like?), the site of its circulation (where and how does the image travel?) and the site of its audience (how is, or was, the image seen, and by which audiences?).
  • The second strand of research examines images as tools and instruments for research. Here we can situate methods like photo elicitation, which uses photographic imagery to evoke information, feelings and memories from an interviewee by presenting them with pictures and asking for their associations or, conversely, by presenting an issue or concept and asking for their associated images. Niederer references Gillian Rose, who calls for a mixed-methods approach that will enable the researcher to explore in more depth and detail the role and meaning of images, their audiences (who is included and excluded) and their circulation.
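
To make Colombo’s “folder of images plus metadata” starting point concrete, here is a minimal sketch of what such a research corpus might look like in code. Everything here (file names, CSV columns, fields) is my own hypothetical illustration, not anything specified in the reading:

```python
import csv
from dataclasses import dataclass, field
from pathlib import Path

@dataclass
class NetworkedImage:
    """One image plus the network of metadata that travels with it."""
    path: Path          # the image file itself
    platform: str       # where it was collected (e.g. Instagram)
    location: str       # geotag or place name, if any
    likes: int          # user engagement
    shares: int
    tags: list = field(default_factory=list)

def load_corpus(image_dir: str, metadata_csv: str) -> list:
    """Pair each image file with its row of platform metadata."""
    corpus = []
    with open(metadata_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            corpus.append(NetworkedImage(
                path=Path(image_dir) / row["filename"],
                platform=row["platform"],
                location=row.get("location", ""),
                likes=int(row["likes"]),
                shares=int(row["shares"]),
                tags=row["tags"].split(";") if row["tags"] else [],
            ))
    return corpus

# Research questions then address the corpus, never a single image:
# e.g. which tags co-occur, which images circulate on several platforms.
```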

Niederer then explains that the programme of the Visual Methodologies Collective being developed in Amsterdam zooms in on visual methodologies for the digital age. This demarcation calls for the mix of disciplines, skills and knowledge necessary to do this kind of research, including digital research methods, design, (visual) storytelling, new media theories and concepts, programming, and critical making. The programme follows the same categories introduced by Rose: the study of images on the one hand, and of users and platforms on the other.

Niederer then ends this portion of the reading by discussing image research, images as content, and image research and platform vernaculars. When discussing image research, Niederer states that we are moving from images as data to images as content. As Flickr became a place of abundant image production and sharing, we, as users, only get to witness the occasional example: a picture posted by a friend or family member, perhaps. We no longer have the illusion that we can keep up fully with what is being shared. In 2012, uploads to Flickr already amounted to 350,000 images per day, on average. In 2018, to mention a few visual platforms and their volumes of use:

  • Instagram: 95 million photos and videos per day
  • Facebook: 350 million photos per day
  • Snapchat: 9,000 snaps per second
  • YouTube: 300 hours of video uploaded per minute
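
These figures mix per-day, per-second and per-minute rates, which makes them hard to compare at a glance. A quick back-of-the-envelope conversion (my own arithmetic, not Niederer’s) puts the upload volumes on a common per-day scale:

```python
SECONDS_PER_DAY = 60 * 60 * 24   # 86,400
MINUTES_PER_DAY = 60 * 24        # 1,440

uploads_per_day = {
    "Instagram (photos/videos)": 95_000_000,
    "Facebook (photos)": 350_000_000,
    "Snapchat (snaps)": 9_000 * SECONDS_PER_DAY,        # ~777.6 million
    "YouTube (hours of video)": 300 * MINUTES_PER_DAY,  # ~432,000 hours
}

for platform, volume in uploads_per_day.items():
    print(f"{platform}: {volume:,.0f} per day")
```

On this scale, Snapchat’s 9,000 snaps per second works out to roughly 778 million items per day – more than double Facebook’s figure.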

A massive amount of visual content is also watched and shared by users; in fact, content that includes images is much more likely to be shared and engaged with on social media today:

  • Instagram and Snapchat users collectively watch 6 billion videos daily
  • YouTube users watch 5 billion videos per day

As those figures illustrate, the ‘pictorial turn’ is not so much a theoretical shift from text to image as a practice driven by users and facilitated by platforms, in which more and more users share and engage with visual content. These practices have sparked a range of new research methods as well. Niederer explains that the methodological entry point is that images can be studied through their networked-ness. The diffusion of networked images also opens up a way to study affective publics that are rendered through shared sentiment, opinion or affect. Studying the ways in which images are repurposed (memes, filters and other visual-textual elements) provides new insights into the dynamic user cultures of a particular platform. Platforms are not only carriers of content and channels of its distribution, but also sites of image production and an entry point to visual research methods as well as novel conceptualisations.

Looking at images as content, Niederer states that research rooted in the arts and humanities can create speculative and experimental inquiries into software and its inputs, interfaces and outputs. This focus provides fertile ground for the study of digital culture and its artefacts. For the quantitative study of images as content, we can draw from the field of content analysis, developed in communication science and known for its unobtrusive methods and its conducive approach to all content types (text, image, sound and audiovisual). Traditionally, content analysis has focused on existing data sets such as a collection of TV broadcasts, the photographs in newspaper articles on a particular topic, or a set of comic books. Digital media, however, can be published or created online and enriched with new opportunities for navigation and interaction: images can be placed in news articles that are networked through in-text hyperlinks and recommendations to similar articles, or pulled into social media using share buttons.

Lastly, Niederer discusses image research and platform vernaculars. In addition to researching special collections on a particular issue or theme, research can also address how platforms as a whole may have a particular visual language. In line with the notion of ‘platform vernaculars’, which refers to the different narrative patterns that shape content and information flows across platforms, we can speak of visual vernaculars: distinct visual patterns and practices on different platforms. In visual vernacular research, images from different platforms are contrasted as offering “windows” on a particular topic or issue. Niederer finishes by stating that this approach offers researchers who think critically about the limitations of studying social media content, and rightly so, a productive way forward by asking: what is this topic according to Twitter? What is it according to Instagram? Do they provide identical, similar or distinct views and descriptions of the same topic? Such questions help create an understanding of both the textual and visual vernaculars, as well as the cultures of use, of each platform.
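
Those closing questions suggest a simple comparative setup. The sketch below is one hypothetical way to operationalise them, counting the most frequent tags per platform for a shared topic; the data and field names are invented purely for illustration:

```python
from collections import Counter

# Hypothetical sample: posts about one topic, collected per platform.
posts = [
    {"platform": "Twitter",   "tags": ["climate", "policy", "news"]},
    {"platform": "Twitter",   "tags": ["climate", "protest"]},
    {"platform": "Instagram", "tags": ["climate", "nature", "photography"]},
    {"platform": "Instagram", "tags": ["nature", "photography"]},
]

def vernacular(platform: str, top_n: int = 3) -> list:
    """Most frequent tags: a crude proxy for a platform's 'window' on the topic."""
    counts = Counter(tag for p in posts if p["platform"] == platform
                     for tag in p["tags"])
    return counts.most_common(top_n)

# "What is this topic according to Twitter? According to Instagram?"
for name in ("Twitter", "Instagram"):
    print(name, vernacular(name))
```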

Moving to the second reading for this week, New Media: A Critical Introduction, we focused on the sections Networks, Users and Economics (pp. 163-169), Wiki Worlds and Web 2.0 (pp. 204-209), The Long Tail (pp. 197-200) and User-generated Content: We Are All Users Now (pp. 221-232). Author Lister takes us through these concepts and breaks down exactly what they are.

Beginning with Networks, Users and Economics (pp. 163-169), Lister states that what we now understand to be the internet in general, and the web in particular, is the product of a number of factors:

  • Its method of development has not been by design;
  • instead, its protean identity is produced by a mix of fandom, community, commerce and business, linked by technologies that are both privately and publicly owned and variously regulated;
  • in other words, the internet came into existence as a result of numerous factors, accidents, passions, collisions and tensions;
  • its ongoing development must therefore be seen as the intersection between economic and regulatory factors and the communicative practices discussed further in the section.

Lister notes the ways in which the desire for communication and the pressures of commercialisation have interacted to bring us Web 2.0 and its expressions in the form of social networking sites; the ways in which the interaction between the culture of open source and commercially produced and protected software gives the development of networked digital media a distinctive character; and the ways in which the development of the internet has given rise to new cultural practices that have become a threat to the interests and business practices of huge corporations, while at the same time giving rise to new media behemoths in online distribution, retailing and services. An obvious example is the way in which the ownership of intellectual property in media, and the desire to protect that ownership, competes with the ongoing enthusiasm of users for swapping files via a myriad of technologies, some of which developed in a particular way directly as a result of the need to get around legal prohibitions on earlier methods.

Lister explains that it is this type of interaction between enthusiasm, politics, commerce and technology that he wishes to explore across the myriad of geeks, businessmen, students, housewives, children, adults, gamers and gardeners that make up the web. To put it simply, he states that to understand networked media it is necessary to understand their development as an ongoing product of the tension between culture and commerce. Lister then notes that the internet simply describes the collection of networks that link computers and servers together, and quotes the definition of the internet given by the US Federal Networking Council (1995):

  • ‘Internet’ refers to the global information system that (i) is logically linked together by a globally unique address space based on the ‘Internet Protocol (IP)’ or its subsequent extensions/follow-ons; (ii) is able to support communications using the ‘Transmission Control Protocol/Internet Protocol (TCP/IP)’ suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein.
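
As a toy illustration of clauses (i) and (ii) of that definition, the sketch below uses Python’s standard socket library to move a few bytes between two endpoints over TCP/IP on the local machine. The port number and message are arbitrary choices for the example:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # an arbitrary local address and port

# 'Server' side: bind to a unique (address, port) pair and listen.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

def echo_once() -> None:
    """Accept a single connection and echo its payload back."""
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

threading.Thread(target=echo_once, daemon=True).start()

# 'Client' side: TCP/IP reliably delivers the bytes; what they mean is
# left to the 'high level services' layered on top (clause iii).
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello, network")
    print(cli.recv(1024))          # -> b'hello, network'
srv.close()
```

Note how little the protocol itself prescribes: it delivers bytes between addresses and says nothing about what flows where, which is exactly the ‘open architecture’ point Lister draws out next.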

Lister goes on to say that this primarily technical definition argues for an internet defined by the ways in which computers are able to send and receive data through globally agreed protocols that permit computers to link together. The important aspect of such a definition is how minimal it is: the internet is here simply a means for computers to communicate in order to provide (undefined) ‘high level services’. The definition is intended to facilitate the flow and exchange of data. Built into it is the concept of ‘open architecture’ – there is no attempt to prescribe how or where such data flows. Previous ‘mass media’ (e.g. newspapers, film, TV) were designed as systems to send messages from a centre to a periphery; here is a system designed from the outset to provide circulation of information.

This ‘open architecture’ model was envisioned as early as 1962 by the visionary J.C.R. Licklider, who wrote a series of memos at MIT describing his ‘Galactic Network’ concept. Lister explains that Licklider became the first head of computer research at the Defense Advanced Research Projects Agency (DARPA) in the US, and it was this Pentagon-funded agency that eventually developed the protocols referred to above, allowing computers to form networks that could send small packets of data to one another. This is where Lister introduces us to the history of the beginnings of the internet, stating that the Internet Society records the growth of computer-based communications from a system based around four hosts/servers in 1969 to 200,000,000 hosts by 2002. These hosts supported an enormous variety of networks, all of which developed from the initial scientific and defence-oriented networks of the original internet. These computer engineering histories determine much of the character of the internet as we experience it today, especially the idea of an open architecture. The internet has a history that stretches back to the Second World War, and its discursive, technological and economic developments all serve to shape our experience today.

The history of the internet draws upon a range of approaches, some of which are synthesised as the study of ‘Computer-Mediated Communication’ (CMC). The study of CMC has primarily developed as a socio-linguistic discipline based in communications theory and sociology. While there is some overlap with media studies in a common concern for understanding forms of technologically mediated communication, it was for many years by no means clear how the internet was a medium in the same way that TV, film or photography were distinct media. It has become increasingly clear, following Bolter and Grusin’s model of remediation, that as existing media find new distribution channels online, they in turn change their cultural form. Hybridising forms of new media emerge through the interaction between existing forms and the new distribution technologies of the net. All media producers now have to consider what TV executives call ‘360-degree programming’: how a TV text will have an online life, how audiences will be offered additional interactive experiences, and how a media product might become ‘transmedial’ by generating revenue across a range of audiences and platforms linked by internet marketing and distribution. Lister explains that leading scholar Steven Jones (1994) summed up the inflated claims for the impact of what was then termed ‘computer-mediated communications’; he observed that popular and critical writing claimed the net would:

  • Create opportunities for education and learning
  • Create new opportunities for participatory democracy
  • Establish countercultures on an unprecedented scale
  • Ensnarl already difficult legal matters concerning privacy, copyright and ethics
  • Restructure man/machine interaction

Lister further notes that David Gauntlett’s 2004 review of ‘some of the main issues’ displays strong continuities with the fundamental issues identified by a previous generation of CMC research. Gauntlett summarises the main areas of research in the field as:

  1. The web allows people to express themselves (by putting up their own sites or through social networks)
  2. Anonymity and play in cyberspace (Gauntlett extends the earlier CMC-based work that seized on the possible anonymities of net-based communications as a living embodiment of post-structuralist identity theory, and asserts that this is where queer theory can really come to life, because the internet breaks the connection between outward expressions of identity and the physical body)
  3. The web and big business (here Gauntlett makes the excellent point that throughout the early phase of the net’s development the dominant discourse on the economics of the web was that business interests would destroy the culture of the web, but that nowadays the bigger panics run in the opposite direction: big businesses are scared that the internet will ruin them)
  4. The web is changing politics and international relations

Moving to Wiki Worlds and Web 2.0 (pp. 204-209), Lister introduces us to the idea of Web 2.0 – a term coined in 2003 by media consultant Tim O’Reilly. The idea of Web 2.0 is that a particular assemblage of software, hardware and sociality has brought about the widespread sense that there is something qualitatively different about today’s web. This shift is allegedly characterised by co-creativity, participation and openness, represented by software that supports, for example, wiki-based ways of creating and accessing knowledge, social networking sites, blogging, tagging and ‘mash-ups’. Web 2.0 also has a clear economic goal: O’Reilly introduced the idea as a phoenix to resurrect the internet economy from the still-smouldering ashes of the ‘dotcom bubble’ crash of 2000. The collapse marked a turning point for the web, and ‘Web 2.0’ served as a call to action.

Next up, in The Long Tail (pp. 197-200), Lister gives us a rundown of the concept of ‘The Long Tail’. One of the ways in which the new dynamics of global economics both shape and reflect the tension between economic determination and media cultures can be seen in the theory of Long Tail economics. Originally argued in Wired in October 2004 and then developed as a book in 2006, Chris Anderson’s work on the Long Tail is, Lister explains, one of the most compelling accounts of the ways in which conventional media economics have changed in post-network cultures. The implications of the Long Tail analysis are far-reaching, arguing that the economic basis of production is changing in ways that unlock market diversity on an unprecedented level. Anderson argues that the capacity of networked communications to connect with a multiplicity of niche markets ensures that lower-volume products can attain a sustainable margin of profitability. Existing media economics have had two major characteristics:

  • One is the ‘hit-driven’ economy: producers (of TV, film, music etc.) have to produce several flops or mid-ranking products to achieve the one hit that will sustain the enterprise.
  • The other is the ‘first copy’ cost principle: the costs of producing the first copy of a newspaper or film are very high, but thereafter margins of profit depend on the cost of distribution; in newspapers this has traditionally been low (each copy is cheap to print), while in film each print is expensive to make.

Lister goes on to state that successful mass media economics depended upon highly capitalised businesses able to spread the risks of the hit-seeking market as well as mass-produce products and get them to the right kind of consumers. These conditions had the effect of making media production a high-cost, high-volume business; low-volume products were unlikely to get made since they lingered, relatively invisible, in the ‘Long Tail’ of the demand curve.
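
Anderson’s argument is essentially about the shape of that demand curve. A toy sketch, assuming a Zipf-like distribution of demand across a catalogue (the numbers are invented for illustration, not Anderson’s), shows how aggregate sales in the tail can rival the hits once distribution costs stop filtering out low-volume products:

```python
# Toy Zipf-like demand: sales of the rank-k product ~ C / k.
CATALOGUE = 100_000   # products a networked retailer can carry
HITS = 100            # what a shelf-space retailer stocks
C = 1_000_000         # sales of the single best seller

def sales(rank: int) -> float:
    return C / rank

head = sum(sales(k) for k in range(1, HITS + 1))
tail = sum(sales(k) for k in range(HITS + 1, CATALOGUE + 1))

print(f"head (top {HITS} hits): {head:,.0f} units")
print(f"tail (the other {CATALOGUE - HITS:,} products): {tail:,.0f} units")
# Harmonic sums: the tail is roughly C * ln(CATALOGUE/HITS) ≈ 6.9 * C here,
# exceeding the head's roughly 5.2 * C - the niches collectively outsell the hits.
```

Under this (admittedly stylised) assumption, the many niche products together outsell the hits, which is the core of the Long Tail claim.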

Finally, in User-generated Content: We Are All Users Now (pp. 221-232), Lister explains that one of the ways in which the internet has become so central to contemporary media culture is that it has offered audiences participatory opportunities. The history of the take-up of these opportunities shows how the marginal possibilities the net first offered audiences for interacting with media are now refashioning the whole enterprise of what is termed ‘Big Media’. Audiences have become ‘users’, and user-generated content has started to become a real competitor to traditional media; the impact of the internet on traditional media institutions has been stronger than anyone predicted. The growth of the blogosphere, the impact of peer-to-peer music distribution and the explosion of YouTube in 2006 have all challenged the foundations of the news media industries. The traditional gatekeepers of culture, the filters of news and guardians of quality, have all had to adjust to the realities of participatory culture.

Studies of fans and fan cultures have spearheaded the theoretical construction of this shift from ‘audiences’ to ‘users’ in media studies. The incursion of the ‘ordinary person’ into the bastions of media privilege is experienced as both opportunity and threat by the industries themselves, and has been understood by academic researchers primarily through the history of active audience studies. ‘Fans’ were the first groups to avail themselves of the mass of website material that exists in a symbiotic relationship with other media texts. This seamless lattice of mediation can be seen as the extension of mass mediation into more and more of our time and space; it also brings within our reach the possibility of becoming producers in our own right. Every SNS post, every conversation in a chatroom, every home page and downloaded MP3 playlist facilitates the individual communicating in a pseudo-public mode of address. What is clear is that a great deal of web use facilitates a feeling of participation in media space.
