A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users.
Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II; some were electromechanical, while others used thermionic valves (vacuum tubes). The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have increased dramatically ever since, with transistor counts rising at a rapid pace (Moore's law observed that counts doubled roughly every two years), leading to the Digital Revolution during the late 20th and early 21st centuries.
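As a back-of-the-envelope illustration of that doubling claim, consider a minimal sketch in Python. It assumes an idealized, perfectly clean two-year doubling period, which real fabrication processes only approximate; the 2,300-transistor starting figure is that of the Intel 4004 of 1971.

    # Idealized Moore's-law growth: transistor count doubles every two years.
    def transistors(initial_count: float, years: float, doubling_period: float = 2.0) -> float:
        return initial_count * 2 ** (years / doubling_period)

    # Starting from the 2,300 transistors of the Intel 4004 (1971),
    # 40 years of clean doubling gives roughly 2.4 billion transistors,
    # the right order of magnitude for CPUs of the early 2010s.
    print(f"{transistors(2300, 40):,.0f}")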
Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.
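To make the stored-program idea concrete, here is a minimal sketch in Python of such a machine. The three-instruction set is made up for illustration and does not correspond to any real architecture; the point is that the conditional jump lets the control unit change the order of operations in response to stored information, exactly as described above.

    # A toy stored-program machine. The program-counter logic plays the
    # role of the sequencing and control unit; JNZ alters the order of
    # operations based on data held in memory.
    def run(program, memory):
        pc = 0  # program counter
        while pc < len(program):
            op, *args = program[pc]
            if op == "ADD":    # arithmetic: memory[a] += memory[b]
                memory[args[0]] += memory[args[1]]
            elif op == "DEC":  # arithmetic: memory[a] -= 1
                memory[args[0]] -= 1
            elif op == "JNZ":  # control: jump to address j if memory[a] != 0
                if memory[args[0]] != 0:
                    pc = args[1]
                    continue
            pc += 1
        return memory

    # Multiply 3 * 4 by repeated addition; memory holds [result, addend, counter].
    print(run([("ADD", 0, 1), ("DEC", 2), ("JNZ", 2, 0)], [0, 3, 4]))  # [12, 3, 0]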
Device mobility can be viewed in the context of several qualities:[2]
- Physical dimensions and weight
- Whether the device is mobile, or some kind of host to which it is attached is mobile
- What kind of host devices it can be bound with
- How devices communicate with a host
- When mobility occurs
Strictly speaking, many so-called mobile devices are not mobile. It is the host that is mobile, i.e., a mobile human host carries a non-mobile smartphone device. An example of a true mobile computing device, where the device itself is mobile, is a robot. Another example is an autonomous vehicle.
There are three basic ways mobile devices can be physically bound to mobile hosts:
- Accompanied,
- Surface-mounted, or
- Embedded into the fabric of a host, e.g., an embedded controller in a host device.
Accompanied refers to an object being loosely bound to and accompanying a mobile host, e.g., a smartphone can be carried in a bag or pocket but can easily be misplaced.[2] By contrast, mobile hosts with embedded devices, such as an autonomous vehicle, can be far larger than pocket-sized.
The most common size of a mobile computing device is pocket-sized, but other sizes exist. Mark Weiser, known as the father of ubiquitous computing,[3] referred to device sizes that are tab-sized, pad-sized, and board-sized,[4] where tabs are accompanied or wearable centimeter-sized devices, while pads are hand-held decimeter-sized devices such as smartphones, phablets, and tablets. If one changes the form of mobile devices to be non-planar, one can also have skin devices and tiny dust-sized devices.[2]
- Dust: miniaturized devices without direct HCI interfaces, e.g., micro-electromechanical systems (MEMS), ranging from nanometers through micrometers to millimeters; see also smart dust.
- Skin: fabrics based upon light-emitting and conductive polymers and organic computer devices; these can be formed into more flexible non-planar display surfaces and products such as clothes and curtains, see OLED display.

Also, see smart device.
Although mobility is often regarded as synonymous with having wireless connectivity, these terms are different. Not all network access by mobile users, applications, and devices needs to be via wireless networks, and vice versa. Wireless access devices can be static, and mobile users can move between wired and wireless hotspots, such as in Internet cafés.[2] Some mobile devices can be used as mobile Internet devices to access the Internet while moving, but they do not need to, and many phone functions and applications remain operational even while disconnected from the Internet.

What makes the mobile device unique compared to other technologies is the inherent flexibility of its hardware and software. Flexible applications include video chat, web browsing, payment systems, near-field communication, audio recording, etc.[5] As mobile devices become ubiquitous, there will be an increase in services that make use of the cloud.[6] Although a common form of mobile device, the smartphone, has a display, another perhaps even more common form of smart computing device, the smart card, e.g., used as a bank card or travel card, does not. Such a device often has a CPU and memory but must connect to, or be inserted into, a reader to display its internal data or state.
Building on previous innovations in mathematics and technology, software was created for the programmable digital computers that emerged in the late 1940s and was necessary to realize their usefulness. The first software was tied closely to the underlying computer hardware, but over time, the lower layers of the system have become more standardized, and software has become increasingly portable between different systems and abstracted from the underlying machine code. Operating systems manage the hardware resources and mediate between different applications that accomplish tasks for the user. Programming languages are the format in which software is written, and must be both human-readable and capable of being translated into unambiguous instructions for computer hardware. Compilers or interpreters are needed to link a program with other code that it relies on and convert the software into machine code that can be executed on the hardware. Programs are combined with each other and with external input to be capable of accomplishing a complex task.
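As a concrete illustration of that translation from human-readable source to lower-level instructions, the following sketch uses Python's built-in compile, dis, and exec. Note that in CPython the translation target is bytecode executed by an interpreter, rather than native machine code produced by a compiler; the pipeline is otherwise analogous.

    import dis

    source = "x = 2 + 3\nprint(x)"            # human-readable program text
    code = compile(source, "<demo>", "exec")   # translated into CPython bytecode
    dis.dis(code)   # show the lower-level instructions the translation produced
    exec(code)      # the interpreter executes them and prints 5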
Software development's central task is programming and maintaining a project's source code, but the term also covers conceiving the project, evaluating its feasibility, analyzing the business requirements, software design, and release. Software quality assurance, including code review and testing, is an essential part of the process, as delivering quality code lowers the cost of reliability failures and of cyberattacks enabled by security vulnerabilities, as well as the cost of maintenance. Maintenance typically consumes 75 percent or more of the software's lifetime engineering budget. Source code is protected by copyright law, which vests the owner with the exclusive right to copy the code. Software has become ubiquitous in everyday life in developed countries. In many cases, software augments the functionality of pre-existing technologies, but it has also enabled the creation of entirely new technologies such as the Internet, video games, social media, mobile phones, and GPS.
The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.
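A minimal sketch of that protocol layering, using Python's standard socket module: DNS resolves a name to an address, TCP/IP carries the bytes, and an application protocol (here, a tiny HTTP request of the World Wide Web) rides on top. It assumes network access, and example.com stands in for any reachable web server.

    import socket

    host = "example.com"  # placeholder: any reachable web server
    print(socket.gethostbyname(host))  # DNS maps the name to an IP address
    # TCP/IP then carries application-layer bytes (a minimal HTTP request).
    with socket.create_connection((host, 80), timeout=5) as conn:
        conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(conn.recv(200))  # the first bytes of the server's reply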
The origins of the Internet date back to research that enabled the time-sharing of computer resources and the development of packet switching in the 1960s, which led to the design of computer networks for data communication.[2][3] The set of rules (communication protocols) to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France.[4][5][6] The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, encouraged worldwide participation in the development of new networking technologies and the merger of many networks using DARPA's Internet protocol suite.[7] The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web,[8] marked the beginning of the transition to the modern Internet,[9] and generated sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet was widely used by academia in the 1980s, the subsequent commercialization in the 1990s and beyond incorporated its services and technologies into virtually every aspect of modern life.
Most traditional communication media, including telephone, radio, television, paper mail, and newspapers, are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspapers, books, and other print publishing have adapted to website technology or have been reshaped into blogging, web feeds, and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has grown exponentially for major retailers, small businesses, and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.
The Internet has no single centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies.[10] The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.[11] In November 2006, the Internet was included on USA Today's list of the New Seven Wonders.[12]
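A small sketch of those two name spaces in practice, again using Python's standard library (network access assumed; example.com is a placeholder). Each result pairs an entry from the IP address space with the protocol family that addresses it, typically yielding both IPv4 and IPv6 addresses for one DNS name.

    import socket

    # Ask the DNS name space which entries in the IP address space
    # stand behind a given name.
    for family, _type, _proto, _canon, sockaddr in socket.getaddrinfo("example.com", None):
        print(family.name, sockaddr[0])  # e.g. AF_INET and AF_INET6 entries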