The IT Canvas

The perpetual interplay of Information Technology and Creativity

The following paper examines the intertwined relationship of creativity and Information Technology (IT), starting with the most recent phase of electronic IT development in the 1940s. The paper's "Rewind" section traces key developments in the realm of IT from that period up to the modern day. IT developments that have helped artists and entrepreneurs introduce novel ideas into the public domain are given close examination, as are those forms of IT invented with the specific purpose of realising unique artistic concepts and meeting business challenges. The "Play" section addresses the forms of IT currently helping artists and entrepreneurs create previously unknown forms of entertainment and business models. Finally, the "Fast Forward" section gives an overview of selected trends and technologies that industry observers have suggested will shape the future of IT development. This chronological examination of how IT has assisted, and continues to support and stimulate, the dream-realisation process should demonstrate that no great idea, whether artistic or commercial, can reach the world stage without a robust IT canvas supporting it.



Creativity and its Role in the History of IT Development

Human beings are intrinsically creative. Creativity is the ability or power to create, produce or express something characterized by originality and imagination. It is a human activity that seeks to fill a necessity, improve a situation, or materialize one's dreams. Creativity applies in all areas of learning, business and personal communication, and of course the arts. It is for everybody, not just the extremely talented few [1]. One of the most concrete displays of human creativity is the continuous development of Information Technology (IT). Starting with a brief overview of how IT has evolved since the 1940s, this section seeks to show how creativity was not only present at every phase of IT evolution but continually served to fast-track the ongoing development of technology. The history of IT consists of four distinct development phases: the Pre-mechanical Age (3000 B.C. to 1450 A.D.), the Mechanical Age (1450–1840), the Electromechanical Age (1840–1940), and the Electronic Age (1940 to the present). Each period is characterized by a principal technology used to solve the input, processing, output and communication problems of the time [2]. In this brief chronological overview, the focus lies on how creativity was involved in every technological innovation stemming from the principal invention of the early electronic age: the computer.

The Development of the Computer

In 1937, Bell Labs' George Stibitz created a relay-based calculator, dubbed the "Model K" after the kitchen table on which he built it. Stibitz then led a team that produced the Complex Number Calculator, which was capable of performing calculations on complex numbers. In 1941, the German inventor and businessman Konrad Zuse developed the Z3, the first fully automatic, programmable computing device, using 2,300 relays and a 22-bit word length. Around the same time, Great Britain completed the first Bombe, enabling Allied forces to decrypt Nazi communications in World War II. British computer pioneer Alan Turing creatively designed the Bombe using a technique known as cribbing, which assumes that a message will contain some text that analysts will be able to guess. With important contributions from other scientists, the Bombes were crucial to Allied intelligence gathering [3]. The Electronic Numerical Integrator and Computer (ENIAC) was originally intended to calculate artillery-firing tables for the US Army during World War II. Instead, it was first used by designers in the calculations for the hydrogen bomb program. The ENIAC, considered today the first general-purpose computer, was 1,000 times faster than contemporary machines and was dubbed "the Giant Brain" at its public unveiling in 1946. IBM's Selective Sequence Electronic Calculator (SSEC), unveiled in 1948, was the first operating machine to combine electronic computation with stored programs. NASA later used moon-position tables based on SSEC outputs for the Apollo 11 moon landing mission [4].

The first computer used for commercial business applications, as well as Great Britain's first commercial computer, was the LEO, created in 1951 by J. Lyons and Co. to address scheduling problems with the daily production and delivery of cakes to Lyons tea shops. In 1956, Lyons began using the LEO to calculate payroll for Ford UK and other companies, one of the first instances of process outsourcing. Lyons eventually began building data processing computers and merged its computer interests with the English Electric Company (forming English Electric LEO, later EELM) to manufacture the LEO computers. The Semi-Automatic Ground Environment (SAGE) was developed to track and intercept enemy aircraft and remained in service for some three decades, starting in the 1950s. It was the first large-scale computer communications network and led to innovations in real-time computing, interactive computing, and online systems. IBM, a contractor on SAGE at the time, developed the AN/FSQ-7 computer, considered a key factor in IBM's success in the computer industry [5]. In 1973, the Xerox Palo Alto Research Center (PARC) designed the Alto, which was the first computer to use a 'desktop' metaphor and a graphical user interface (GUI), and among the first to use a mouse. It featured, among other innovations, an email tool (Laurel), the first WYSIWYG editors (Bravo and Gypsy), and an early paint program (Markup). The Osborne 1, released in 1981, is considered the first laptop computer: it weighed 24 pounds and came with a 5-inch monitor, a modem port, two 5¼-inch floppy drives, a bundle of software programs and a battery pack. As a portable and hardwearing computer that closed up for protection and had a carrying handle, it was creatively designed to survive being moved about [6].

The first commercially successful computer to use a mouse and a GUI rather than a command-line interface was the 1984 Macintosh. The Macintosh 128k's look and feel was heavily influenced by Xerox PARC's groundbreaking GUI technology. It shipped with two applications: MacPaint and MacWrite, the latter offering WYSIWYG word processing [7]. In 1985, the Macintosh computer line received a big sales boost with the introduction of the LaserWriter printer and Aldus PageMaker, which together made home desktop publishing possible [8]. In 1985, Steve Jobs founded NeXT Inc. and set out to create a new programming environment that would be object-oriented, meaning that programs could share information and features, and would integrate capabilities then available only on high-end workstations and Macs, including a WYSIWYG editor, an intuitive user interface and a fully multitasking operating system. Jobs recruited major developers to help create the new operating system, dubbed NeXTstep. Since NeXTstep required a great deal of storage and a large hard drive would have been very expensive, the hardware engineers decided to adopt the magneto-optical (MO) format, which was substantially faster than floppy disks. And as Jobs wanted "what the user saw on the display to exactly mimic the printed page", NeXT worked on porting Adobe's PostScript to NeXTstep, creating Display PostScript in the process. The new computer's case was unique, a 12-inch magnesium cube. The NeXT Cube was publicly released in 1988, and each package included a 17-inch MegaPixel grayscale display (1120 x 832 pixels with four shades of gray), a 400 dpi laser printer, and built-in Ethernet networking [9]. It was also the first personal computer to include a drive for an optical storage disk and voice recognition technology [10].

The Rise of the Internet

The history of the Internet is considered to have started in 1957, when the USSR launched Sputnik, the first artificial earth satellite. In response, the United States formed the Advanced Research Projects Agency (ARPA) to establish a US lead in science and technology applicable to the military. The US Air Force commissioned Paul Baran of the RAND Corporation in 1962 to study how the USAF could maintain command and control over its missiles and bombers after a nuclear attack. The idea was to have a decentralized military research network that could survive a nuclear strike, so that if any locations were attacked, the military could still control its nuclear arms for a counter-attack. Baran's final proposal was a packet-switched network. "Packet switching is the breaking down of data into datagrams or packets that are labeled to indicate the origin and the destination of the information and the forwarding of these packets from one computer to another computer until the information arrives at its final destination computer. This was crucial to the realization of a computer network. If packets are lost at any given point, the message can be resent by the originator."[11] The first e-mail program was created in 1972. The Network Control Protocol (NCP) was then used to transfer data, allowing communication between hosts running on the same network.
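The packet-switching principle Baran proposed can be illustrated with a short sketch (a toy Python model, not the actual NCP or ARPANET implementation; the host names, packet size and field names below are purely illustrative):

```python
import random
from dataclasses import dataclass

@dataclass
class Packet:
    origin: str       # source host label
    destination: str  # target host label
    seq: int          # position of this fragment in the message
    total: int        # total number of fragments
    payload: str      # the data fragment itself

def packetize(message, origin, destination, size=8):
    """Break a message into labelled packets, as packet switching does."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [Packet(origin, destination, seq, len(chunks), chunk)
            for seq, chunk in enumerate(chunks)]

def reassemble(packets):
    """Reorder packets by sequence number; report any that were lost."""
    expected = packets[0].total
    received = {p.seq: p.payload for p in packets}
    missing = [seq for seq in range(expected) if seq not in received]
    if missing:
        return None, missing  # the originator would resend these
    return "".join(received[s] for s in range(expected)), []

# Packets may take different routes and arrive out of order;
# shuffling simulates that, and reassembly still recovers the message.
msg = "Packets are labelled with origin and destination."
pkts = packetize(msg, "host-a", "host-b")
random.shuffle(pkts)
text, lost = reassemble(pkts)
assert text == msg and lost == []
```

The key property the sketch demonstrates is the one the quotation emphasizes: because each packet carries its own addressing and ordering information, delivery order does not matter, and a lost packet can be identified and resent.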

In 1973, development began on the TCP/IP protocol suite, which allowed diverse computer networks to interconnect and communicate with each other [12]. The development of Ethernet in the 1970s enabled the exchange of data between computers in a standardized, high-speed manner. The Domain Name System (DNS), created in 1983, made it much easier for people to address other computers. Tim Berners-Lee, a scientist at CERN, the European Organization for Nuclear Research, developed HTML, which made a large impact on how we navigate and view the Internet today. The World Wide Web (WWW) was introduced in 1991, and the first audio and video files were distributed over the Internet the following year. The Mosaic web browser, a graphical user interface to the WWW, was developed in 1993 [13], and eBay went live in 1995, a period in which the Internet was largely transformed into a commercial enterprise. The Google search engine was born in 1998, which in turn brought a whole new perspective to the use of the Internet. In 1999, the wireless networking technology commonly referred to as Wi-Fi was standardized; it appeared as a built-in feature of portable computers and many handheld devices in the years that followed. Facebook debuted in 2004 and opened the era of social networking [14]. One year later YouTube was launched, offering a free video-sharing environment to everyone with access to a computer and broadband Internet.


The widespread use of personal computers and the Internet has coincided with the rise of e-commerce. E-commerce, or electronic commerce, is the buying and selling of products or services via the Internet. With e-commerce, consumers are able to research the market, obtain quotes, make comparisons and conduct online purchases in the comfort of their homes. Electronic Data Interchange (EDI) paved the way for what came to be practiced as e-commerce. EDI was first introduced in the 1960s and replaced the traditional mailing and faxing of documents with digital data transfer from one computer to another [15]. EDI also facilitated Electronic Funds Transfer (EFT) and enabled companies to electronically send and receive commercial documents. This in turn led to other forms of electronic money transactions such as automatic teller machines (ATMs), credit cards and telephone banking [16].

The early development of online shopping is credited to Michael Aldrich, an English inventor and entrepreneur. He got the idea while on a stroll with his wife, lamenting their weekly supermarket shopping expedition. That conversation triggered the creative idea of linking a television to their supermarket to deliver the groceries. In 1979, he connected a television set to a transaction-processing computer via a telephone line and created what he coined "teleshopping", meaning shopping at a distance [17]. In 1982, France launched the Minitel, a successful pre-World-Wide-Web online service. This service used a Videotex terminal accessible through telephone lines and functioned as an end-user information system displaying text on a television screen. The Minitel system gradually fell out of favour following the subsequent success of the Internet [18]. Initially, there were many concerns about online shopping, but the development of the Secure Sockets Layer (SSL) protocol by Netscape in 1994 secured the transmission of data via the Internet. Web browsers are able to check whether a site has an authenticated SSL certificate and thereby determine whether or not it can be trusted. This encryption protocol is a vital part of web security today, and this creative approach to alleviating consumers' concerns about security has helped boost online sales in the US e-commerce sector, which a Forrester report projected would grow from $157 billion to $218 billion, coming to comprise about 10% of the total US retail market [19].
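The certificate check described above can be approximated with Python's standard ssl library. The helper below is an illustrative sketch of what a browser does before trusting a site, not Netscape's original implementation; the function name is our own:

```python
import socket
import ssl

def inspect_certificate(hostname, port=443, timeout=5):
    """Open a TLS connection and return the server's validated certificate.

    ssl.create_default_context() enables both certificate verification
    and hostname checking, mirroring the trust decision a browser makes.
    An untrusted or mismatched certificate raises an SSL error instead
    of silently proceeding.
    """
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

# Even without opening a connection, we can confirm that a default
# context refuses unverified certificates out of the box.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True
```

Calling `inspect_certificate("example.com")` on a live network would return the site's certificate fields (issuer, subject, validity dates) only if the chain of trust validates, which is precisely the guarantee that reassured early online shoppers.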



IT and the creative way of doing business: E-commerce

The growth of e-commerce as a new business model has its roots in business creativity. When it first came to public awareness, e-commerce was heralded as a New World where old-economy conventions were disdained and rule books rejected, where old truths about management techniques as well as office protocol no longer applied. E-commerce draws on information technologies such as mobile commerce, electronic funds transfer, supply chain management, Internet marketing, online transaction processing, electronic data interchange (EDI), inventory management systems, and automated data collection systems. Modern e-commerce typically uses the World Wide Web at least at one point in the transaction's life-cycle, although it may encompass a wider range of technologies such as e-mail, mobile devices, social media and telephones as well [20].

When speaking about innovative electronic business (e-business) organizations, we normally think of web pioneers such as Amazon [21]. Amazon is well known for its ability to collect, analyze, and reuse marketing data. It relies on high-quality data in its database and uses business intelligence tools to analyze that data in support of business decisions [22]. The company furthermore records data on customer buying behavior, which enables it to offer or recommend a specific item, or bundle of items, to an individual based on preferences demonstrated through purchases or items viewed. On January 31, 2013, Amazon experienced an outage that lasted approximately 49 minutes, causing chaos and underlining the indispensability of IT in this new and innovative way of doing business [23].

IT and Artists

Commerce, however, is obviously not the only area where IT can play a pivotal role in the success or failure of an idea. IT and the arts can also interact through the use of technology to extend the expressive range of, and modes of access to, existing genres of the arts. One example of this is the use of the Apple iPad in painting. David Hockney was an important contributor to the Pop art movement of the 1960s and is considered one of the most influential British artists of the 20th century [24]. Since 2009, Hockney has painted hundreds of portraits, still lifes and landscapes using the iPad Brushes application. Hockney once said that Van Gogh would have loved the iPad [25]. His show Fleurs fraîches (Fresh Flowers) was held at the Fondation Pierre Bergé in Paris. A Fresh Flowers exhibit opened in 2011 at the Royal Ontario Museum in Toronto, featuring more than 100 of his drawings on 25 iPads and 20 iPods. In late 2011, Hockney revisited California to paint Yosemite National Park on his iPad. For the 2012/2013 season at the Vienna State Opera, he designed on his iPad a large-scale picture (176 m²) as part of the exhibition series "Safety Curtain" [26].

IT and the Entertainment Industry

IT also aids musicians, in that digital technology allows composers to create their own instruments and performances using only the computer. Computer-driven pianos can now replicate a performance with many of the nuances of the original pianist's keystrokes. Interactive computer programs can participate in a performance, producing and manipulating sounds in response to a performer's actions. Digital technology, through interactivity, has helped bring spontaneity back into the performance of electronic music [27]. And from the score to the script, the computer plays an equally pivotal role in filmmaking. From the popular Final Draft screenwriting program through pre-production planning to picture and sound editing, computer skills and applications are essential to the final product's success [28]. Nowadays, computer-generated imagery (CGI) offers many technical and creative ways to realize a filmmaker's vision for a motion picture project. New creatures and just about anything else springing from the artist's imagination can be realized using 3D CGI; from mythical lifeforms to inanimate objects, they can all be brought to life [29].

Toy Story (1995), the first fully computer-animated feature film, pioneered the use of 3D CGI. Soon after, other large-scale productions, the most notable being James Cameron's Avatar, captured the public imagination through various innovative visual effects, including a new system for lighting massive areas like Pandora's jungle, a motion-capture stage six times larger than any previously used, and an improved method of capturing facial expressions to enable full life-like replication. To achieve this face capturing, actors wore individually made skull caps fitted with a tiny camera positioned in front of their faces; the information collected about their facial expressions and eyes was then transmitted to computers. According to Cameron, the method allowed the filmmakers to transfer 100% of the actors' physical performances to their digital counterparts [30]. And when we consider gaming as a component of the entertainment industry, we can point to research conducted at Michigan State University into the relationship between children's IT usage and their creativity. The researchers found that kids who play more video games tend to be more creative [31]. They defined creativity, in this case, as a mental process involving the generation of new ideas or concepts, or of new associations between existing ideas or concepts [32].

IT and Design

It should be mentioned here that computer graphics, computer-aided design, interaction design, and virtual environment building are not just the domain of computer games. These forms of technology also enable many traditional creative practices [33]. Architects can use computer-aided design/manufacturing systems to design and build forms that would have been impossible in the past [34]. Computer-aided design (CAD) is the use of computer systems to assist in the creation, modification, analysis, or optimization of a design. CAD software is used to increase the productivity of the designer, improve the quality of designs, improve communication through documentation, and create a database for manufacturing [35]. The architect Ian Ritchie used a CAD system to help design his innovative gallery in the Natural History Museum in London. He claimed that he was able to generate a more sophisticated 3D form than he could have in the same time using conventional manual drawing, and that he would not have attempted such complex forms without the CAD system [36].

Finally, a newer information-technology-based design method is 3D printing, the process of making a three-dimensional solid object of virtually any shape from a digital model [37]. 3D printing starts with a blueprint, usually created with a CAD program running on a desktop computer: a virtual 3D model of an object. CAD programs are widely used today by designers, engineers, and architects to render physical objects before they are created in the real world [38]. This technology can be used in industrial design, architecture, engineering and construction, automotive, aerospace, the dental and medical industries, education, geographic information systems, civil engineering, jewelry, footwear and other forms of fashion design [39], a field that is itself rapidly changing in the face of emerging information technology. Software can help fashion designers draw, create woven textures, drape models to create patterns, adjust sizes and even determine fabric colors [40]. "Fast fashion" is a new fashion business model brought about by IT developments: designs move from the catwalk to the store in the shortest possible time to take advantage of current trends. The fashion retailer Zara has optimized its supply chain to allow it to design, produce and deliver a garment within 15 days [41]. An even more pioneering fashion marketing model is Retail 2.0, where IT connects fashion storefronts that carry no inventory but simply let customers order an item and have it delivered.


Fast Forward

The IT infrastructure required to support such developments is manifold and complex, but it also shows how modern computing technology can influence the commercial and artistic actions of entrepreneurs and artists, particularly once such developments are de rigueur and no longer in their infancy. The following closing analysis provides selected prognostications by futurists and industry observers in order to further demonstrate how the implementation of any new 'global' idea will necessitate a robust IT canvas to support it in the future, just as it does now. Given the breadth of what IT means across the fields examined in this paper, namely commerce and the arts (quite broad topics in themselves), only what we deem to be the key trends in each field shall be examined. These are the trends the authors believe will have a strong impact on the respective developments required in IT in the future, which in turn will influence what artists and entrepreneurs can do with it. Gamification shall form the fulcrum of the examination of creativity in the commercial sense, while tablet-based expressions of art and the continued rise of social media and its IT backbone shall be the principal examples of the ways Information Technology is expected to promulgate creativity in the decade ahead.


One IT trend growing in relevance to business is the concept of gamification, which can be described as the process of motivating people to act in the real world through the medium of games supported by a robust information technology infrastructure. This development has come about thanks to the rapid development of mobile and Internet-enabled technologies over the last ten years, which has enabled people to create, participate and play in new ways [42]. As Forbes contributor Brian Burke puts it, this legitimization of game play for adults is being leveraged to improve "customer engagement, employee performance, training and education, innovation management, personal development, sustainability, health and wellness" [43], to name a few areas of commerce. One example of a successful implementation of gamification in the office is the UK Department for Work and Pensions' Idea Street initiative, which encourages its workforce to come up with new ideas via a collaborative social game. An online points table with associated game mechanics allows employees to submit ideas as part of the organization's overall innovation management [44]. Moving forward, this general trend could see up to 40 percent of Global 1000 organizations using gamification as the primary mechanism to transform business operations by 2015, according to a prognosis by Gartner, Inc. [45].
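The exact mechanics of Idea Street are not public, but the points-table pattern it exemplifies can be sketched in a few lines of Python (the class, action names and point values below are hypothetical, chosen only to illustrate the general idea):

```python
from collections import defaultdict

# Hypothetical point values for actions in an idea-management game;
# the real Idea Street rules are not documented publicly.
POINTS = {"submit_idea": 10, "comment": 2, "idea_adopted": 50}

class IdeaGame:
    """A minimal gamification engine: award points, rank participants."""

    def __init__(self):
        self.scores = defaultdict(int)

    def record(self, employee, action):
        """Award points for a recognized action; unknown actions score zero."""
        self.scores[employee] += POINTS.get(action, 0)

    def leaderboard(self, top=3):
        """Rank employees by total points, highest first."""
        return sorted(self.scores.items(), key=lambda kv: -kv[1])[:top]

game = IdeaGame()
game.record("alice", "submit_idea")
game.record("alice", "idea_adopted")
game.record("bob", "submit_idea")
game.record("bob", "comment")
assert game.leaderboard() == [("alice", 60), ("bob", 12)]
```

The visible leaderboard is the game mechanic doing the motivational work: contributions are immediately reflected in a public ranking, which is the core loop gamified innovation-management systems build upon.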

Gamification is also making its way into other forms of recreational activity such as reading and art, with traditional-medium artists such as David Hockney, as we have seen, using the new technologies to generate their works. Interestingly, however, information technology is also found where business and art combine, as in the case of startups developing apps that blend the reading process with animation and music in ways unlike traditional computer games, television shows, books or comics. Known by a variety of names, these new apps are fundamentally a hybridization of different kinds of art. They draw upon the various functions innate to modern incarnations of the tablet PC, those light, consumption-focused handheld devices like the Apple iPad and Samsung's Galaxy Tab, to create a new audiovisual experience for users. A good example here is the work Wivern Digital is publishing as part of its Addison's Tales digital book label. The digital book label concept involves drawing on the benefits of one of the earliest forms of mechanical IT, namely the book, and combining this traditional reading process with more recent audiovisual developments like animation, music and gaming.

The Road Ahead

The truly innovative development, however, is the user's ability to shape the direction of the written story through touch interaction and the tablet's proximity sensors and accelerometer, the device that detects motion, shock and the like. A specific artistic innovation Wivern Digital has been able to implement thanks to the tablet's IT architecture is its Enlivened Rhymes, essentially a bundling of animation and purchasable music at certain scenes in a book that gives readers the chance to glimpse into the imaginations of the characters they are reading about [46]. It is similar to the 'flashback' literary device used in script, game and novel writing, but the flashbacks are now actually portals into unwritten aspects of the storyline, embedded as musical interludes within the digital pages of a storybook.

However, this example illustrates only one way IT helps define how the artist, or in this case the writer, can create for the mobile device as a canvas with user consumption as the end goal. There are also numerous apps that allow users to create art with the tablet or smartphone itself, ranging from the iPad Brushes app right up to something as futuristic as drawing your own 3D image in the air with the Airmarkr iPhone app and receiving it as a 3D printout. The underlying IT that allows for both artistic consumption and creation comprises various elements, of course, including the ARM architecture found in Google's Android-based tablets as well as in the Apple iPad's A4 and A5 processors. The mobile operating systems (MOS) used in the respective tablet devices also act as the 'enabling link' between the tactile interactions of the user with the app and the hardware they hold in their hands. It could be argued that writing the code required for the relevant MOS to ensure a smooth user experience during this human-machine interaction is an art form in itself.

To this end, Adobe ActionScript (based on ECMAScript), native C and C++ are among the languages likely to remain widely used to create apps for the Android MOS [47], while Objective-C is still required for developments to be approved by Apple. C++ is a compiled language also used by Facebook, the world's largest social networking site and the focus of the final part of this analysis. One of the key features and attractions of Facebook is the platform's ability to instantly serve terabytes' worth of data stored in text and JPEG format across its network, namely the comments and photos that users share with those in their network or with the world at large (if a company page is in question, or the user has chosen to make his posts public). This robustness, the free nature of the service for users, plus the extensive reach Facebook now has, naturally provides artists with a direct and very affordable forum in which to present their works to potentially millions of people. This process is made possible by Haystack, an ad-hoc storage solution developed by Facebook to store the billions of photos uploaded by users [48], along with BigPipe, its "custom technology to accelerate page rendering using a pipelining logic" [49].

All this hidden technology underlying the simplicity of the 'like' concept allows digital renditions of works that appeal to a particular subset of people to be shared with others in a split second. Through this process and the IT behind it, Facebook can be seen as an international forum or gallery that can be monetized in numerous creative ways, including via Kiosked, which allows artists to 'tag' their digital merchandise anywhere online (including in games, videos and of course Facebook or blog pictures), or Sellaround, a 'shop' widget that can be installed on a Facebook page or website, to name just two recent startups. Essentially the name of the game is still the same: the artist needs a middleman or forum to sell their works, but by using these web services, the barriers to market entry are now defined primarily by whether one has internet access and a computer.

This touches on two key trends predicted to shape the IT landscape in both commerce and art, with which I would like to conclude: micro-incomers and the cashless society, both of which require existing and yet-to-be-developed technologies to support them. The proliferation of smartphones and tablets has coincided with a rise in startups like Square and its various clones offering payment processing via a card reader attached to a mobile device. This enables traders to accept credit and debit card payments on the spot and could sharply reduce the need to carry cash if the early adopters are followed by general acceptance of the concept. It also has the potential to create "a data-driven eco-system of rewards, purchase history, daily-deals and more," according to the futurist Thomas Frey, who goes on to explain that Near Field Communication is the technology that allows encrypted data to be exchanged between two devices in close proximity [50]. Frey has also commented on the coming age of micro-incomers, those people who can make a living working or selling their skills, creations and services online. Both trends can be harnessed by artists and entrepreneurs alike in new and innovative concepts and platforms, the scope of which can only be realized by a sufficiently robust IT architecture, scalable enough to support increasing usage rates as the trend becomes commonplace.


Creativity manifests itself in multiple fields and contexts. These manifestations vary in form and character, in associated terminology, and in the types of benefits that result. In science and mathematics, the most fundamental outcome of creative intellectual effort is important new knowledge. In engineering and other technology-based industries, creativity yields technological inventions. Such inventions can result in commercially successful products and in improvements to the quality of life (as, for example, when technology enables a new form of entertainment like social networking or creating on tablet devices). An essential component of economic success is ensuring that the ideas and talents of those who work with such technologies can be captured in innovative ways, so that the products, applications and services that shape and reshape our lives can come into being. This paper has sought to highlight just some of the ways IT supports this continually interweaving process of invention, application, invention and application ad infinitum.

References


  1. What Is Creativity, Education and Learning, Derbyshire County Council. Retrieved: 27.02.2013
  2. A History of Information Technology and Systems, Jeremy G. Butler. Retrieved: 07.03.2013
  3. 14 Key Points in Computer Development Since 1940, Larry Murray. Retrieved: 07.03.2013
  4. 14 Key Points in Computer Development Since 1940, Larry Murray. Retrieved: 07.03.2013
  5. 14 Key Points in Computer Development Since 1940, Larry Murray. Retrieved: 07.03.2013
  6. 14 Key Points in Computer Development Since 1940, Larry Murray. Retrieved: 07.03.2013
  7. 14 Key Points in Computer Development Since 1940, Larry Murray. Retrieved: 07.03.2013
  8. Inventors of the Modern Computer, Mary Bellis. Retrieved: 10.03.2013
  9. Full Circle: A Brief History of NeXT, Tom Hormby. Retrieved: 07.03.2013
  10. 14 Key Points in Computer Development Since 1940, Larry Murray. Retrieved: 07.03.2013
  11. The History of the Internet, Dave Kristula. Retrieved: 08.03.2013
  12. Internet History Timeline: ARPANET to the World Wide Web, Kim Zimmermann. Retrieved: 08.03.2013
  13. Internet History Timeline: ARPANET to the World Wide Web, Kim Zimmermann. Retrieved: 08.03.2013
  14. Internet History Timeline: ARPANET to the World Wide Web, Kim Zimmermann. Retrieved: 08.03.2013
  15. The History of Ecommerce: How Did It All Begin?, Miva Merchant. Retrieved: 08.03.2013
  16. History of E-Commerce, Preetam Kaushik. Retrieved: 08.03.2013
  17. The History of Ecommerce: How Did It All Begin?, Miva Merchant. Retrieved: 08.03.2013
  18. The History of Ecommerce: How Did It All Begin?, Miva Merchant. Retrieved: 08.03.2013
  19. US Online Retail Forecast, 2010 To 2015, Sucharita Mulpuru et al. Retrieved: 10.03.2013
  20. E-commerce, Wikipedia. Retrieved: 08.03.2013
  21. Caterpillar moves to revamp supply-chain operations via the Web, Marc L. Songini. Retrieved: 08.03.2013
  22. Business Processes and Information Technology, Ulric J. Gelinas, Jr. Retrieved: 08.03.2013
  23. Retrieved: 08.03.2013
  24. David Hockney RA: A Bigger Picture, Royal Academy of Arts. Retrieved: 08.03.2013
  25. David Hockney's iPad art, Martin Gayford. Retrieved: 08.03.2013
  26. David Hockney's iPad Doodles Resemble High-Tech Stained Glass, Martin Gayford. Retrieved: 08.03.2013
  27. Beyond Productivity: Information, Technology, Innovation, and Creativity (2003), National Research Council (US), Abrufdatum: 08.03.2013
  28. Apple Computer Requirement, School of Filmmaking, UNCSA, Abrufdatum: 08.03.2013
  29. Advantages of Using 3D CGI Programs in Filmmaking, Rianne Hill Soriano, Abrufdatum: 08.03.2013
  30. Avatar (2009 film), Wikipedia, Abrufdatum: 14.03.2013
  31. Information technology use and creativity: Findings from the Children and Technology Project, Linda A. Jackson, Abrufdatum: 14.03.2013
  32. Information technology use and creativity: Findings from the Children and Technology Project, Linda A. Jackson, Abrufdatum: 14.03.2013
  33. Beyond Productivity: Information, Technology, Innovation, and Creativity (2003) , the Committee on Information Technology and Creativity, National Research Council, Abrufdatum: 08.03.2013
  34. Beyond Productivity: Information, Technology, Innovation, and Creativity (2003) , the Committee on Information Technology and Creativity, National Research Council, Abrufdatum: 08.03.2013
  35. Computer-aided design, Wikipedia, Abrufdatum: 14.03.2013
  36. CAD and Creativity: Does the Computer Really Help? Bryan Lawson, Abrufdatum: 14.03.2013
  37. 3D printing, Wikipedia, Abrufdatum: 14.03.2013
  38. E-commerce: Not a new economic model, InfoRefuge, Abrufdatum: 14.03.2013
  39. 3D printing, Wikipedia, Abrufdatum: 14.03.2013
  40. Importance of Computers in Fashion Designing, Kieve Kavanaugh, Abrufdatum: 14.03.2013
  41. Retail 2.0 The eight trends reshaping the face of global retail – and how businesses can react, DHL, Abrufdatum: 14.03.2013
  42. Gamification Comes of Age, Adam Swann, Abrufdatum: 13.03.2013
  43. The Gamification of Business, Brian Burke, Abrufdatum: 13.03.2013
  44. Gamification Comes of Age, Adam Swann, Abrufdatum: 13.03.2013
  45. The Gamification of Business, Brian Burke, Abrufdatum: 13.03.2013
  46. Addison’s Tales Approductions, Jerome Goerke, Abrufdatum: 13.03.2013
  47. Which programming languages can be used to develop in Android, Community Message Board, Abrufdatum: 13.03.2013
  48. Hip hop for PHP, Haiping Zhao, Abrufdatum: 13.03.2013
  49. BigPipe: Pipelining web pages for high performance, Changhao Jiang, Abrufdatum: 13.03.2013
  50. 28 major trends for 2012 and beyond, Thomas Frey, Abrufdatum: 13.03.2013