Electronic technology (article from The Free Dictionary)


electronics

electronics, science and technology concerned with the controlled flow of electrons or other carriers of electric charge, especially in semiconductor devices. It is one of the principal branches of electrical engineering. The invention of the transistor, announced in 1948, and the subsequent development of integrated circuits have brought about revolutionary changes in electronics, which was previously based on the technology of the electron tube. The miniaturization and savings in power brought about by these developments have allowed electronic circuits to be packaged more densely, making possible compact computers, advanced radar and navigation systems, and other devices that use very large numbers of components (see microelectronics). They have also brought to the consumer such items as smaller and more reliable radio and television receivers, advanced sound- and video-recording and reproducing systems, microwave ovens, cellular telephones, and powerful yet inexpensive personal computers. The consumer electronics industry, which began in 1920 when radio broadcasting started in the United States, accounts for annual sales of close to $50 billion in the United States alone. Because of advances in electronics manufacturing technology, the cost of electronic products often decreases even as quality and reliability increase. Power requirements are continually reduced, allowing greater portability.

Electronics

the science that deals with the interaction of electrons and electromagnetic fields and with the methods of developing electronic devices and equipment, in which the interaction is used to convert electromagnetic energy for, primarily, the transmission, processing, and storage of information. The most typical conversions are the generation, amplification, and detection of electromagnetic oscillations at frequencies of up to 10^12 hertz (Hz), as well as at frequencies in the range from 10^12 Hz to 10^20 Hz, which includes infrared, visible, and ultraviolet radiation and X-rays. Conversion at such high frequencies is possible because of the exceptionally low response time of the electron, which is the smallest of all known charged particles. Electronics investigates the interactions of electrons both with the macrofields in the working cavities of electronic devices and with the microfields in atoms, molecules, and crystal lattices.

Electronics is based on various branches of physics—electrodynamics, classical and quantum mechanics, optics, thermodynamics, and solid-state physics—and on such sciences as chemistry, metallurgy, and crystallography. Using the findings of these and other bodies of knowledge, electronics defines new tasks for other sciences, thereby stimulating their development. In addition, electronics creates devices and equipment that provide the sciences with new means and methods of investigation.

An important practical contribution of electronics is the development of devices that perform various functions in systems used for the conversion and transmission of information, in control systems, in computer apparatus, and in equipment for the energy industry. Electronics also formulates the scientific principles underlying the technology used in the manufacture of electronic devices and the technology that applies electronic and ionic processes and devices to various fields of science and engineering.

Electronics is playing a leading role in the scientific and technical revolution. The introduction of electronic devices in various areas of human activity contributes in large and often decisive measure to the resolution of complex scientific and technical problems, to an increase in the productivity of physical and mental labor, and to the improvement of economic indexes of production. The achievements of electronics have formed the basis of an industry that produces electronic equipment used in communications, automation, television, radar, computer technology, instrument-making, and industrial-process control systems, as well as illuminating-engineering, infrared, and X-ray equipment.

Historical survey. Electronics as a science originated in the early 20th century, after a series of important advances had been recorded. Between 1856 and 1873, the principles of electrodynamics were formulated. Thermionic emission was investigated between 1882 and 1901, photoelectric emission between 1887 and 1905, and X-rays between 1895 and 1897. J. J. Thomson discovered the electron in 1897, and between 1892 and 1909 the classical electron theory took shape.

The development of electronics began with the invention of the diode tube by J. A. Fleming in 1904 and the three-electrode tube, or triode, by L. De Forest in 1906. In 1913, the German engineer A. Meissner used a triode to generate electric oscillations. Between 1919 and 1925, M. A. Bonch-Bruevich developed powerful water-cooled vacuum-tube generators for radio transmitters used in long-range radio communication and broadcasting.

An experimental prototype of a phototube was constructed by A. G. Stoletov in 1888, and industrial models were prepared by the German scientists J. Elster and H. Geitel in 1910. P. V. Timofeev developed a single-stage multiplier phototube in 1928, and L. A. Kubetskii a multistage multiplier phototube in 1930.

The invention of the phototube made sound motion pictures possible. In addition, such television camera tubes as the vidicon, the iconoscope, the image iconoscope, and the image orthicon were developed on the basis of the phototube. A design for the vidicon was proposed by A. A. Chernyshev in 1925. S. I. Kataev and V. K. Zworykin, working independently of each other, developed iconoscopes in 1931 and 1932, and P. V. Timofeev and P. V. Shmakov invented the image iconoscope in 1933. The image orthicon was first described in 1946 by the American scientists A. Rose, P. Weimer, and H. Law. A two-sided target for such a tube, however, was proposed in 1939 by the Soviet scientist G. V. Braude.

A framework for the development of radar in the centimeter range was provided by the invention of the multiresonator magnetron in 1936–37 by N. F. Alekseev and D. E. Maliarov, who were working under M. A. Bonch-Bruevich. The development of a reflex klystron in 1940 by N. D. Deviatkov and co-workers and, independently, by the Soviet engineer V. F. Kovalenko also contributed to this framework. The drift-tube klystron and the traveling-wave tube, which was developed in 1943 by the American scientist R. Kompfner, made possible the development of radio relay communication systems and particle accelerators and contributed to the creation of space communication systems. The concept of the drift-tube klystron was proposed in 1932 by D. A. Rozhanskii. The device was developed in 1935 by the Soviet physicist A. N. Arsen’eva and the German physicist O. Heil and constructed in 1938 by the American physicists R. and S. Varian and others.

Gas-discharge, or ion, devices were developed and improved concurrently with vacuum-tube devices. Such gas-discharge devices included mercury-arc rectifiers, which are used primarily to convert alternating current to direct current in high-power industrial installations; thyratrons, which shape powerful pulses of electric current in pulse-forming devices; and gas-discharge light sources.

Semiconductor electronics began with the use of crystalline semiconductors as detectors in radio receivers between 1900 and 1905. Its development continued with the invention of copper oxide and selenium current rectifiers and photocells between 1920 and 1926 and with O. V. Losev’s invention of the oscillating crystal receiver in 1922. W. Shockley, W. Brattain, and J. Bardeen’s invention of the transistor, in 1948, marked the beginning of an era of expansion in the field.

The development of the planar process for fabricating semiconductor structures and of methods of integrating a large number of microelements, such as transistors, diodes, capacitors, and resistors, on a single-crystal semiconductor wafer led to a new trend in electronics—microelectronics (see also INTEGRATED ELECTRONICS). The principal efforts in integrated electronics have been aimed at the development of integrated circuits—microminiature electronic devices, such as amplifiers, converters, central processing units, and memories. Integrated circuits consist of hundreds or even thousands of electronic devices placed on a single semiconductor crystal that has an area of several square millimeters. Microelectronics has provided new opportunities for the solution of such problems associated with the growth of contemporary social production as the automation of industrial process control, the processing of information, and the improvement of computer technology.

The invention of the maser—a quantum electronics device developed in 1955 by N. G. Basov and A. M. Prokhorov and, independently, by C. Townes—revealed the unique potential of electronics that is associated with use of the powerful coherent light of lasers and with the synthesis of extremely precise quantum frequency standards.

Soviet scientists have made a large contribution to the development of electronics. Fundamental investigations in the physics and technology of electronic devices have been carried out by many researchers, including M. A. Bonch-Bruevich, L. I. Mandel’shtam, N. D. Papaleksi, S. A. Vekshinskii, A. A. Chernyshev, and M. M. Bogoslovskii. B. A. Vvedenskii, V. D. Kalmykov, A. L. Mints, A. A. Raspletin, and M. V. Shuleikin are among those who have studied problems associated with the excitation and conversion of electric oscillations, with the emission, propagation, and reception of radio waves, and with the interaction of radio waves and current carriers in a vacuum, in gases, and in solids. A. F. Ioffe carried out original research in the physics of semiconductors, S. I. Vavilov in luminescence and other areas of physical optics, and I. E. Tamm in the quantum theory of the scattering of light, in radiation theory, and in the theory of the photoeffect in metals.

Fields, principal branches, and areas of application. Electronics comprises three fields of research: vacuum electronics, solid-state electronics, and quantum electronics. Each field is subdivided into a number of branches and a number of areas of application. A branch combines groups of like physicochemical phenomena and processes that are of fundamental importance to the development of many classes of electronic devices in a given area. An area of application encompasses not only the methods of designing and constructing electronic devices that are similar in operating principle or function but also the techniques used in the devices’ manufacture.

VACUUM ELECTRONICS. Vacuum electronics includes the following branches: (1) emission electronics, which encompasses thermionic emission, photoemission, secondary emission, and field emission, as well as problems of cathodes and antiemission coatings; (2) the formation and control of electron and ion fluxes; (3) the formation of electromagnetic fields with resonators, resonator systems, slow-wave circuits, and power input and output devices; (4) electronoluminescence, or cathodoluminescence; (5) the physics and technology of high vacuums, that is, the production, maintenance, and monitoring of high vacuums; (6) thermal processes such as vaporization in a vacuum, deformation of parts under cyclic heating, surface breakdown of metals under pulsed heating, and heat discharge of equipment components; (7) surface phenomena associated with the formation of films on electrodes and insulators and of irregularities on electrode surfaces; (8) surface-treatment technology, which includes treatment with electron beams, ions, and lasers; and (9) gas media, a branch that includes aspects of the production and maintenance of optimal gas composition and pressure in gas discharge devices.

The principal areas of application of vacuum electronics encompass aspects of the development of various electron-tube devices. These devices include such vacuum tubes as triodes, tetrodes, and pentodes; such microwave tubes as magnetrons and klystrons; such electron-beam devices as picture tubes and oscillograph tubes; such photoelectric devices as phototubes and photomultipliers; X-ray tubes; and such gas-discharge devices as high-power rectifiers, light sources, and indicators.

SOLID-STATE ELECTRONICS. The branches and areas of application of solid-state electronics are associated primarily with semiconductor electronics. The principal branches of semiconductor electronics are the following: (1) the study of the properties of semiconductor materials and the effects of impurities on those properties; (2) the creation of areas of differing conductivity on a single crystal by means of epitaxy, diffusion, ion implantation, or irradiation of semiconductor structures; (3) the application of dielectric and metallic films on semiconductor materials and the development of the technology for fabricating films with the necessary properties and configurations; (4) the investigation of the physical and chemical processes that occur on semiconductor surfaces; and (5) the development of methods and equipment for producing and measuring microelements that are a few micrometers or less in size.

The basic areas of application of semiconductor electronics are associated with the development and manufacture of various types of semiconductor devices. Such devices include semiconductor diodes (rectifier, mixer, parametric, and avalanche diodes), amplifier and oscillator diodes (tunnel, avalanche transit time, and Gunn diodes), transistors (bipolar and unipolar), thyristors, optoelectronic devices (light-emitting diodes, photo-diodes, phototransistors, optrons, and light-emitting-diode and photodiode matrices), and integrated circuits.

The areas of application of solid-state electronics also include dielectric electronics, magnetoelectronics, acoustoelectronics, piezoelectronics, cryoelectronics, and the development and manufacture of resistors.

Dielectric electronics deals with the electronic processes that occur in dielectrics—particularly in thin dielectric films—and the use of such processes in, for example, the development of dielectric diodes and capacitors. Magnetoelectronics makes use of the magnetic properties of matter to control the flow of electromagnetic energy by means of ferrite isolators, circulators, and phase shifters and to develop memories, including those based on ferromagnetic domains.

Acoustoelectronics and piezoelectronics deal with the propagation of acoustic surface and body waves, the variable electric fields that such waves generate in crystalline materials, and the interaction of the fields with electrons in devices with a piezoelectric semiconductor structure, such as quartz frequency stabilizers, piezoelectric filters, ultrasonic delay lines, and acoustoelectronic amplifiers. Cryoelectronics, in which the changes brought about in the properties of solids by extremely low temperatures are studied, involves the construction of low-noise microwave amplifiers and oscillators and ultrahigh-speed computers and memories, as well as the design and manufacture of resistors.

QUANTUM ELECTRONICS. The most important application of quantum electronics is the development of lasers and masers. Quantum electronics devices serve as the basis of instruments used for the accurate measurement of distances (range finders), quantum frequency standards, quantum gyroscopes, optical-frequency multichannel communication systems, long-range space communication systems, and radio astronomy. The powerful action of laser radiation on matter is made use of in industry. Lasers also find application in biology and medicine.

Electronics is in a stage of intense development. New fields of electronics are evolving, and new areas of application in the current fields are being found.

The technology of electronic devices. The design and manufacture of electronic devices are based on the use of physicochemical processes and a combination of various properties of materials. It is necessary, therefore, to understand thoroughly the processes used and their effects on the properties of the devices and to be able to control the processes with precision.

The great importance of physicochemical research and of the development of the scientific bases of engineering in electronics stems from the dependence of the properties of electronic devices on the presence of doping agents and substances adsorbed on the surfaces of a device’s working elements, as well as the dependence of the properties on gas composition and the degree of rarefaction of the medium surrounding the elements. It is also due to the dependence of the reliability and service life of electronic devices on the degree of stability of the raw materials used and on the controllability of the fabrication technology.

Technological advances often stimulate the development of new areas of application in electronics. Engineering features common to all areas of application of electronics are the requirements—exceptionally high in comparison with other branches of technology—that the electronics industry imposes on the properties of the raw materials used, on the degree of protection provided for the workpieces during production, and on the geometric precision of the fabrication of electronic devices.

Fulfillment of the first of these requirements makes possible the synthesis of ultrapure materials with a structure that has a high degree of perfection and with predetermined physicochemical properties. The development of such materials—which include special composites of single crystals, ceramics, and glasses—and the study of their properties constitute the subject of a special scientific and engineering discipline called electronic materials science.

One of the most acute engineering problems associated with the second requirement is dust control in the gaseous medium in which the most critical fabrication processes take place. In many cases, no more than three dust particles of less than 1 micrometer in diameter are acceptable per cubic meter.

The requirements for geometric precision in the fabrication of electronic devices are exceedingly stringent. Often, the relative error in size cannot exceed 0.001 percent, and the dimensions and relative positions of the elements of integrated circuits must be accurate to hundredths of a micrometer. Such stringency dictates that new, more advanced methods of working with materials be developed, as well as new techniques and equipment for quality control.

Manufacturing processes in electronics require extensive use of the latest methods and technology, which include electron-beam, ultrasonic, and laser processing and welding; photolithography and electron-beam and X-ray lithography; electron-discharge machining; ion implantation; plasma chemistry; molecular epitaxy; electron microscopy; and techniques that employ vacuum devices with a residual gas pressure of as low as 10^-13 mm Hg.

Apart from the general aims of increasing labor productivity, the automation of the production of electronic devices through the use of computers is made imperative by a degree of complexity in many manufacturing processes that requires the elimination of subjective human influence. These and other features of the manufacturing processes in electronics have necessitated the creation of a new area of application of machine building—electronic machine building.

Prospects for development. One of the primary problems facing electronics is the need to reduce the size and power consumption of computer and electronic control systems while increasing the amounts of information processed. This problem is being solved in a number of ways. Integrated circuits that have a switching time of as little as 10^-11 sec are being developed, and the degree of integration is being increased so that as many as 1 million transistors, with features 1–2 micrometers in size, can be placed on a single crystal. Optical-frequency communication devices, optoelectronic converters, and superconductors are being used in integrated circuits, and memories with capacities of several megabits are being designed for single semiconductor crystals. The problem is also being addressed by the use of laser and electron-beam switching and by the expansion of the functional capabilities of integrated circuits. For example, microcomputers, rather than simply microprocessors, are being placed on single semiconductor crystals. The changeover from the two-dimensional, or planar, technology of integrated circuits to three-dimensional, or bulk, technology and the use of a combination of the various properties of a solid in one device are also helping to solve the problem. The principles and techniques of stereoscopic television, which can convey more information than can conventional television, are being developed and implemented, and electronic devices operating in the millimeter and submillimeter regions are being fabricated for wide-band and, consequently, more efficient data transmission systems.

Electronic methods and equipment are used in biology in the study of cells and the structure and responses of living organisms and in medicine in diagnostics, therapy, and surgery. As electronics develops and the technology of the production of electronic devices improves, the range of application of electronics will expand in all areas of human life and activity, and the role of electronics in accelerating scientific and technical progress will grow.


Electronics

Technology involving the manipulation of voltages and electric currents through the use of various devices for the purpose of performing some useful action. This large field is generally divided into two primary areas, analog electronics and digital electronics.

Analog electronics

Historically, analog electronics was used in large part because of the ease with which circuits could be implemented with analog devices. However, as signals have become more complex, and the ability to fabricate extremely complex digital circuits has increased, the disadvantages of analog electronics have increased in importance, while the importance of simplicity has declined.

In analog electronics, the signals to be manipulated take the form of continuous currents or voltages. The information in the signal is carried by the value of the current or voltage at a particular time t. Some examples of analog electronic signals are amplitude-modulated (AM) and frequency-modulated (FM) radio broadcast signals, thermocouple temperature data signals, and standard audio cassette recording signals. In each of these cases, analog electronic devices and circuits can be used to render the signals intelligible.

Commonly required manipulations include amplification, rectification, and conversion to a nonelectronic signal. Amplification is required when the strength of a signal of interest is not sufficient to perform the task that the signal is required to do. However, the amplification process suffers from the two primary disadvantages of analog electronics: (1) susceptibility to replication errors due to nonlinearities in the amplification process and (2) susceptibility to signal degradation due to the addition, during the amplification process, of noise originating from the analog devices composing the amplifier. These two disadvantages compete with the primary advantage of analog electronics, the ease of implementing any desired electronic signal manipulation. See Amplifier, Distortion (electronic circuits)
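
A small numerical sketch can make the noise point concrete. The gain, noise level, and number of stages below are illustrative only and do not model any particular amplifier; the sketch simply shows that each analog stage amplifies the noise already present and adds a little of its own, so the signal-to-noise ratio can only degrade.

```python
# Sketch: how noise builds up through a cascade of analog amplifier stages.
# Each stage multiplies the incoming signal power and noise power by its gain
# (gain**2 in power terms) and then adds its own noise on top.

import math

signal_power = 1.0        # power of the wanted signal at the input (arbitrary units)
noise_power = 0.0         # no noise at the input
gain = 2.0                # voltage gain of each stage
stage_noise_power = 0.01  # noise each stage adds at its output (illustrative)

for stage in range(1, 6):
    signal_power = (gain ** 2) * signal_power
    noise_power = (gain ** 2) * noise_power + stage_noise_power
    snr_db = 10 * math.log10(signal_power / noise_power)
    print(f"after stage {stage}: SNR = {snr_db:.1f} dB")
```

The printed signal-to-noise ratio gets a little worse at every stage and can never be recovered by further amplification, which is the degradation described above; a digital link, by contrast, can regenerate clean 0/1 levels at each repeater.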

Digital electronics

The advent of the transistor in the 1940s made it possible to design simple, inexpensive digital electronic circuits and initiated the explosive growth of digital electronics. Digital signals are represented by a finite set of states rather than a continuum, as is the case for the analog signal. Typically, a digital signal takes on the value 0 or 1; such a signal is called a binary signal. Because digital signals have only a finite set of states, they are amenable to error-correction techniques; this feature gives digital electronics its principal advantage over analog electronics. See Transistor
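
As a minimal illustration of why a finite set of states helps, the sketch below adds a single even-parity bit to a binary word. This is only error detection, the simplest case of the error-correction techniques mentioned above, and the encoding is a generic textbook scheme rather than anything tied to a particular system.

```python
# Even-parity sketch: append one bit so that every valid word contains an even
# number of 1s. A single flipped bit makes the parity odd, so the receiver can
# tell the word was corrupted (though this simple code cannot say which bit).

def add_parity(bits):
    return bits + [sum(bits) % 2]

def parity_ok(word):
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
assert parity_ok(word)

word[2] ^= 1                      # simulate a one-bit error in transmission
assert not parity_ok(word)        # the corruption is detected
print("single-bit error detected")
```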

In common two-level digital electronics, signals are manipulated mathematically according to the rules of Boolean algebra. The operations permissible in Boolean algebra are NOT, AND, OR, and XOR, plus various combinations of these elemental operations.
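
A short sketch of those elemental operations, using Python integers purely as an illustration (the names and truth tables are standard Boolean algebra, not tied to any particular logic family):

```python
# The four elemental two-level operations acting on the values 0 and 1,
# plus an example of building one operation (XOR) out of the others.

def NOT(a):     return 1 - a
def AND(a, b):  return a & b
def OR(a, b):   return a | b
def XOR(a, b):  return a ^ b

def XOR_from_and_or_not(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

print(" a b | NOT a  AND  OR  XOR")
for a in (0, 1):
    for b in (0, 1):
        assert XOR(a, b) == XOR_from_and_or_not(a, b)
        print(f" {a} {b} |   {NOT(a)}     {AND(a, b)}    {OR(a, b)}   {XOR(a, b)}")
```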

Electronic circuits are composed of various electronic devices, such as transistors, resistors, and capacitors. In circuits built from discrete components, the components are typically soldered together on a fiberglass board known as a printed circuit board. On one or more surfaces of the printed circuit board are layers of conductive material which has been patterned to form the interconnections between the different components in the circuit. In some cases, the circuits necessary for a particular application are far too complex to build from individual discrete components, and integrated-circuit technology must be employed. Integrated circuits are fabricated entirely from a single piece of semiconductor substrate. It is possible in some cases to put several million electronic devices inside the same integrated circuit. Many integrated circuits can be fabricated on a single wafer of silicon at one time, and at the end of the fabrication process the wafer is sawed into individual integrated circuits. These small pieces, or chips as they are popularly known, are then packaged appropriately for their intended application. See Capacitor, Integrated circuits, Printed circuit

The microprocessor is the most important integrated circuit to arise from the field of electronics. This circuit consists of a set of subcircuits that can perform the tasks necessary for computation; microprocessors are the heart of modern computers. Microprocessors that understand large numbers of instructions are called complex instruction set computers (CISCs), and microprocessors that have only a very limited instruction set are called reduced instruction set computers (RISCs). See Digital computer

Other circuit designs have been standardized and reduced to integrated-circuit form as well. An example of this process is seen in the telephone modem. Modulation techniques have been standardized to permit the largest possible data-transfer rates in a given amount of bandwidth, and standardized modem chips are available for use in circuit design. See Modem

The memory chip is another important integrated electronic circuit. This circuit consists of a large array of memory cells composed of a transistor and some other circuitry. As the storage capacity of the memory chip has increased, significant miniaturization has taken place. See Circuit (electronics)


DNA can now store images, video and other types of data (Science News for Students)

All of the data from more than 600 smartphones — 10,000 gigabytes worth — could be stored on the tiny pink smear of DNA at the end of this test tube.

This is one in a series presenting news on technology and innovation, made possible with generous support from the Lemelson Foundation.

With a smartphone, you can look up facts, stream videos, check out Facebook, read tweets and listen to music. But all of those data aren’t stored on your phone. They are kept somewhere else, perhaps half a world away. For now, companies like Microsoft, Amazon and Facebook store those data on magnetic tapes or other media. It’s an ever-growing library of data that takes up lots of space in sprawling data centers. And even the best storage media last only a few decades at most. Then they need to be replaced. But there may be a better way to keep and guard information, some researchers say. Store and retrieve it — with DNA.

DNA holds the genetic information that tells each cell inside a living being what to do. Each side of a DNA molecule’s twisted, ladder-like structure is made of four chemical building blocks. They’re called nucleotides and are known as A, T, C and G. (The letters stand for adenine, thymine, cytosine and guanine.) In various combinations, these letters spell out the code for our genes.

Computers currently store data as series of 0s and 1s. But data also can be written using the four building blocks of DNA, says Luis Ceze. As a computer architect at the University of Washington in Seattle, he studies how computers and data systems should be designed and function. Labs can make strands of synthetic DNA, one nucleotide block at a time. Combinations can be developed, as a code, to stand for numbers, letters or other digital information. Later, other lab equipment can translate those building blocks along a strand of DNA. In that way, they can decode the original data.
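
One common textbook mapping packs two bits into each nucleotide. The sketch below uses an arbitrary A/C/G/T assignment and is only an illustration of the idea; it is not the exact coding scheme used by Ceze's team, which also has to respect chemical constraints and add error correction.

```python
# Sketch: turn bytes into a string of DNA letters (two bits per base) and back.
# The particular bit-to-base assignment here is arbitrary.

BITS_TO_BASE = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = ''.join(f'{byte:08b}' for byte in data)
    return ''.join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = ''.join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b'cat')
print(strand)                  # 'CGATCGACCTCA' for the three bytes of "cat"
assert decode(strand) == b'cat'
```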

Why bother? DNA can hold lots of information in a tiny space. In theory, a volume of DNA the size of a sugar cube could hold as much data as a Walmart-sized storage center. Plus, Ceze says, unlike magnetic tape, DNA can last unchanged for thousands of years.

Work on DNA data storage started years ago. Ceze’s team has just added what’s known as “random access” to the method. It offers a way to find a specific file. Each data file gets its own unique “address.” It works in much the same way as a house number, street name and zip code guide a mail carrier to your home. The researchers add those digital addresses to each DNA strand holding data for a particular file.
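
In spirit, the addressing scheme can be sketched with plain strings standing in for physical strands. The real system selects strands chemically with PCR primers rather than by string matching, so the code below is only an analogy, and the addresses and payloads are made up.

```python
# Sketch of per-file addresses for random access. Every strand stored for a file
# carries that file's address as a prefix; retrieving the file means picking out
# just the strands whose prefix matches, much as PCR primers latch onto strands
# whose end sequence matches a chosen address.

ADDRESS_LEN = 8
pool = []   # the unordered "tube" holding strands from many different files

def store(address, payload_chunks):
    assert len(address) == ADDRESS_LEN
    for chunk in payload_chunks:
        pool.append(address + chunk)

def retrieve(address):
    return [strand[ADDRESS_LEN:] for strand in pool if strand.startswith(address)]

store('AACCGGTT', ['GATTACAT', 'TTAGGCAT'])   # file 1
store('TTGGCCAA', ['CCCCAAAA'])               # file 2
print(retrieve('AACCGGTT'))                   # -> ['GATTACAT', 'TTAGGCAT']
```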

Ceze’s team, which included people from Microsoft, reported its new advance on April 6 at the International Conference on Architectural Support for Programming Languages and Operating Systems, held in Atlanta, Ga.

A borrowed tool

To search for a specific file in a large quantity of DNA, the Seattle team uses a tool called PCR. It’s short for polymerase (Puh-LIM-er-ase) chain reaction. Here’s how PCR works: DNA goes into a test tube, along with strings of nucleotides known as primers. Each primer is chosen to match the address sequences at the ends of selected DNA strands. Single nucleotides and a few other things are in the mix, too. The test tube then goes into a machine that heats and cools the soup of genetic material over and over.

Heating up double-stranded DNA separates it into single threads. After the sample cools down, the primers seek out and bind to the ends of the specific strands that scientists are interested in. Single nucleotides in the mix then bind to the rest of the strand.

Each time the heating and cooling cycle repeats, it’s like pressing start on a copying machine; the PCR duplicates DNA. These cycles repeat over and over and over, making millions of copies of the target DNA. Scientists describe this as “amplifying” the DNA.

PCR will copy desired snippets of DNA so many times that soon they greatly outnumber all of the rest of the genetic material in a sample.
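
The arithmetic behind that outnumbering is simple doubling. Under the idealized assumption that every cycle copies every target strand perfectly (real PCR efficiency is lower), the copy count grows as 2 to the power of the number of cycles:

```python
# Idealized PCR yield: one target strand, doubled once per heat-and-cool cycle.
for cycles in (10, 20, 30):
    print(f"{cycles} cycles -> {2 ** cycles:,} copies")
# 10 cycles -> 1,024 copies
# 20 cycles -> 1,048,576 copies (about a million)
# 30 cycles -> 1,073,741,824 copies (about a billion)
```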

Many scientists already use PCR. It’s used to copy the DNA found at a crime scene, for instance. That lets forensic scientists work with the DNA and compare it to other samples, such as one from a suspect. Similarly, environmental scientists might use PCR to amplify the foreign DNA they find in a river in hopes of matching it to a particular species of fish.

Making lots of copies of a specific bit of DNA can now help pick out a data file, Ceze says.

He compares the idea to trying to get a bowl of alphabet soup with only certain letters. Picking out individual letters would take a really long time. But suppose you were able to selectively copy, over and over, just the letters you liked. Eventually, nearly every scoop you took out of the bowl would contain just what you wanted. Likewise, PCR can make sure that the DNA picked out after the process is pretty much just what you had been looking for. Then lab equipment can read that DNA to decode its stored data.

PCR is a pretty standard tool in genetics research. But borrowing that tool to find specific DNA data files didn’t happen until Ceze took a break from his regular work and spent time in a microbiology lab. There, he learned about PCR. And that led to the team’s idea for random access. “You see two things, and then all of a sudden you see that they could be connected,” he explains.

Avoiding errors

Making and copying large amounts of DNA is “hard to control exactly,” Ceze says. So his team also built in a way to deal with errors. When data have been encoded into the fake DNA, overlapping parts of each section will go onto three separate DNA strands. In order to decode a file, a computer needs data from at least two of the three strands. That way, even if one strand has errors, the other two strands will still have saved the data.
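
A much-simplified sketch of that redundancy is position-by-position majority voting over three complete copies. The real encoding overlaps strands rather than storing three identical copies, so this is only meant to show why data from two intact strands out of three is enough:

```python
# Sketch: recover a sequence when any one of three copies may contain errors.
# At each position, keep the symbol that at least two of the three copies agree on.

def majority_decode(copy_a, copy_b, copy_c):
    recovered = []
    for a, b, c in zip(copy_a, copy_b, copy_c):
        recovered.append(a if a in (b, c) else b)   # a matches another copy, or b == c
    return ''.join(recovered)

stored  = 'ACGTACGT'
damaged = 'ACGAACGT'    # this copy picked up an error at position 3
print(majority_decode(stored, damaged, stored))   # -> 'ACGTACGT', the error is voted out
```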

The new system also doesn’t require the same accuracy for all types of data. Relaxing standards for some types of material makes it easier to store large files. For example, text files might require a very high level of precision. In contrast, most people won’t notice if a few pixels are off in yet another picture of their cat.

In lab tests, the system worked very well. The researchers successfully coded video files of people talking about war crimes in the African nation of Rwanda. When they later searched for those files, they found them easily. The group also encoded and reconstructed four image files.

Dean Tullsen is a computer-science engineer at the University of California, San Diego. He chaired the meeting's session in which Ceze’s group described its new system for DNA storage and file retrieval. He says that it’s not clear whether or when DNA data storage might become common. But the University of Washington team has “shown some very exciting potential,” he says. “The best part of the work is that they have actually stored some pictures in synthesized DNA” in the lab, he adds. The team then “read the data back out with no errors.”

Of course, one of those pictures showed a cat.

Power Words


cell The smallest structural and functional unit of an organism. Typically too small to see with the naked eye, it consists of watery fluid surrounded by a membrane or wall. Animals are made of anywhere from thousands to trillions of cells, depending on their size. Some organisms, such as yeasts, molds, bacteria and some algae, are composed of only one cell.

chemical A substance formed from two or more atoms that unite (become bonded together) in a fixed proportion and structure. For example, water is a chemical made of two hydrogen atoms bonded to one oxygen atom. Its chemical symbol is H2O. Chemical can also be an adjective that describes properties of materials that are the result of various reactions between different compounds.

DNA (short for deoxyribonucleic acid) A long, double-stranded and spiral-shaped molecule inside most living cells that carries genetic instructions. In all living things, from plants and animals to microbes, these instructions tell cells which molecules to make.

DNA sequencing The process of determining the exact order of the paired building blocks — called nucleotides — that form each rung of a ladder-like strand of DNA. There are only four nucleotides: adenine, cytosine, guanine and thymine (which are abbreviated A, C, G and T). And adenine always pairs up with thymine; cytosine always pairs with guanine.

engineer A person who uses science to solve problems. As a verb, to engineer means to design a device, material or process that will solve some problem or unmet need.

environmental science The study of ecosystems to help identify environmental problems and possible solutions. Environmental science can bring together many fields including physics, chemistry, biology and oceanography to understand how ecosystems function and how humans can coexist with them in harmony. People who work in this field are known as environmental scientists.

forensics The use of science and technology to investigate and solve crimes.

gene (adj. genetic) A segment of DNA that codes, or holds instructions, for producing a protein. Offspring inherit genes from their parents. Genes influence how an organism looks and behaves.

molecule An electrically neutral group of atoms that represents the smallest possible amount of a chemical compound. Molecules can be made of single types of atoms or of different types. For example, the oxygen in the air is made of two oxygen atoms (O2), but water is made of two hydrogen atoms and one oxygen atom (H2O).

microbiology The study of microorganisms, principally bacteria, fungi and viruses. Scientists who study microbes and the infections they can cause or ways that they can interact with their environment are known as microbiologists.

nucleotides The four chemicals that, like rungs on a ladder, link up the two strands that make up DNA. They are: A (adenine), T (thymine), C (cytosine) and G (guanine). A links with T, and C links with G, to form DNA. In RNA, uracil takes the place of thymine.

pixel Short for picture element. A tiny area of illumination on a computer screen, or a dot on a printed page, usually placed in an array to form a digital image. Photographs are made of thousands of pixels, each of different brightness and color, and each too small to be seen unless the image is magnified.

polymerase chain reaction (PCR) A biochemical process that repeatedly copies a particular sequence of DNA. A related, but somewhat different technique, copies genes expressed by the DNA in a cell. This technique is called reverse transcriptase PCR. Like regular PCR, it copies genetic material so that other techniques can identify aspects of the genes or match them to known genes.

random Something that occurs haphazardly or without reason, based on no intention or purpose.

random access The process of storing or retrieving a particular data file directly no matter where it’s stored in a medium, instead of having to encode or decode the whole body of data.

smartphone A cell (or mobile) phone that can perform a host of functions, including search for information on the Internet.

species A group of similar organisms capable of producing offspring that can survive and reproduce.

synthetic An adjective that describes something that did not arise naturally, but was instead created by people. Many have been developed to stand in for natural materials, such as synthetic rubber, synthetic diamond or a synthetic hormone. Some may even have a chemical makeup and structure identical to the original.


Further Reading

S. Ornes. “Genetic memory.” Science News for Students. February 8, 2013.

S. Ornes. “DNA takes notes.” Science News for Students. June 4, 2012.

Original Meeting Source: J. Bornholt et al. A DNA-based archival storage system. Association for Computing Machinery. Proceedings of the Twenty-First International Conference on Architectural Support for Programming Languages and Operating Systems. Atlanta, Ga., April 6, 2016. p. 637. doi: 10.1145/2872362.2872397.


Computer science: The learning machines (Nature News)

Using massive amounts of data to recognize photos and speech, deep-learning computers are taking a big step towards true artificial intelligence.


Three years ago, researchers at the secretive Google X lab in Mountain View, California, extracted some 10 million still images from YouTube videos and fed them into Google Brain — a network of 1,000 computers programmed to soak up the world much as a human toddler does. After three days looking for recurring patterns, Google Brain decided, all on its own, that there were certain repeating categories it could identify: human faces, human bodies and … cats (ref. 1).

Google Brain's discovery that the Internet is full of cat videos provoked a flurry of jokes from journalists. But it was also a landmark in the resurgence of deep learning: a three-decade-old technique in which massive amounts of data and processing power help computers to crack messy problems that humans solve almost intuitively, from recognizing faces to understanding language.

Deep learning itself is a revival of an even older idea for computing: neural networks. These systems, loosely inspired by the densely interconnected neurons of the brain, mimic human learning by changing the strength of simulated neural connections on the basis of experience. Google Brain, with about 1 million simulated neurons and 1 billion simulated connections, was ten times larger than any deep neural network before it. Project founder Andrew Ng, now director of the Artificial Intelligence Laboratory at Stanford University in California, has gone on to make deep-learning systems ten times larger again.

Such advances make for exciting times in artificial intelligence (AI) — the often-frustrating attempt to get computers to think like humans. In the past few years, companies such as Google, Apple and IBM have been aggressively snapping up start-up companies and researchers with deep-learning expertise. For everyday consumers, the results include software better able to sort through photos, understand spoken commands and translate text from foreign languages. For scientists and industry, deep-learning computers can search for potential drug candidates, map real neural networks in the brain or predict the functions of proteins.

“AI has gone from failure to failure, with bits of progress. This could be another leapfrog,” says Yann LeCun, director of the Center for Data Science at New York University and a deep-learning pioneer.

“Over the next few years we'll see a feeding frenzy. Lots of people will jump on the deep-learning bandwagon,” agrees Jitendra Malik, who studies computer image recognition at the University of California, Berkeley. But in the long term, deep learning may not win the day; some researchers are pursuing other techniques that show promise. “I'm agnostic,” says Malik. “Over time people will decide what works best in different domains.”

Inspired by the brain

Back in the 1950s, when computers were new, the first generation of AI researchers eagerly predicted that fully fledged AI was right around the corner. But that optimism faded as researchers began to grasp the vast complexity of real-world knowledge — particularly when it came to perceptual problems such as what makes a face a human face, rather than a mask or a monkey face. Hundreds of researchers and graduate students spent decades hand-coding rules about all the different features that computers needed to identify objects. “Coming up with features is difficult, time consuming and requires expert knowledge,” says Ng. “You have to ask if there's a better way.”


In the 1980s, one better way seemed to be deep learning in neural networks. These systems promised to learn their own rules from scratch, and offered the pleasing symmetry of using brain-inspired mechanics to achieve brain-like function. The strategy called for simulated neurons to be organized into several layers. Give such a system a picture and the first layer of learning will simply notice all the dark and light pixels. The next layer might realize that some of these pixels form edges; the next might distinguish between horizontal and vertical lines. Eventually, a layer might recognize eyes, and might realize that two eyes are usually present in a human face (see 'Facial recognition').
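
A minimal sketch of that layered arrangement is a tiny feed-forward pass in NumPy. The layer sizes, random weights, and "face / not face" labels below are all invented for illustration, and untrained random weights of course detect nothing; the point is only the structure of stacked layers, each feeding the next.

```python
# Sketch: a tiny feed-forward neural network. Each layer multiplies its input by
# a weight matrix and applies a nonlinearity; stacking layers is what lets later
# layers respond to edges, then to parts such as eyes, then to whole objects.

import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights):
    return np.maximum(0.0, inputs @ weights)   # ReLU nonlinearity

image = rng.random(64)                     # a fake 8x8 grey-scale image, flattened

w1 = rng.standard_normal((64, 32))         # pixels -> low-level features (edges)
w2 = rng.standard_normal((32, 16))         # edges  -> parts (e.g. eyes)
w3 = rng.standard_normal((16, 2))          # parts  -> scores: face vs not-face

hidden1 = layer(image, w1)
hidden2 = layer(hidden1, w2)
scores = hidden2 @ w3
print(scores)   # meaningless until the weights are learned from many example images
```

Learning consists of nudging the weight matrices so that the final scores match labelled examples, which is the "changing the strength of simulated connections" described earlier.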

The first deep-learning programs did not perform any better than simpler systems, says Malik. Plus, they were tricky to work with. “Neural nets were always a delicate art to manage. There is some black magic involved,” he says. The networks needed a rich stream of examples to learn from — like a baby gathering information about the world. In the 1980s and 1990s, there was not much digital information available, and it took too long for computers to crunch through what did exist. Applications were rare. One of the few was a technique — developed by LeCun — that is now used by banks to read handwritten cheques.

By the 2000s, however, advocates such as LeCun and his former supervisor, computer scientist Geoffrey Hinton of the University of Toronto in Canada, were convinced that increases in computing power and an explosion of digital data meant that it was time for a renewed push. “We wanted to show the world that these deep neural networks were really useful and could really help,” says George Dahl, a current student of Hinton's.

As a start, Hinton, Dahl and several others tackled the difficult but commercially important task of speech recognition. In 2009, the researchers reported (ref. 2) that after training on a classic data set — three hours of taped and transcribed speech — their deep-learning neural network had broken the record for accuracy in turning the spoken word into typed text, a record that had not shifted much in a decade with the standard, rules-based approach. The achievement caught the attention of major players in the smartphone market, says Dahl, who took the technique to Microsoft during an internship. “In a couple of years they all switched to deep learning.” For example, the iPhone's voice-activated digital assistant, Siri, relies on deep learning.

Giant leap

When Google adopted deep-learning-based speech recognition in its Android smartphone operating system, it achieved a 25% reduction in word errors. “That's the kind of drop you expect to take ten years to achieve,” says Hinton — a reflection of just how difficult it has been to make progress in this area. “That's like ten breakthroughs all together.”
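
For context, word errors are usually counted as the minimum number of substitutions, insertions and deletions needed to turn the system's transcript into the reference transcript, and a 25% reduction is relative: the error rate itself shrinks by a quarter. The sketch below uses made-up numbers and a standard edit-distance count; it is not Google's evaluation code.

```python
# Sketch: word error count via edit distance over words, plus what a 25%
# relative reduction in errors means. All numbers here are hypothetical.

def word_errors(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)]

reference = "the cat sat on the mat"
hypothesis = "the cat sat on a mat"
print(word_errors(reference, hypothesis) / len(reference.split()))  # WER = 1/6

old_wer = 0.20                   # hypothetical error rate of the older system
new_wer = old_wer * (1 - 0.25)   # a 25% relative reduction
print(new_wer)                   # -> 0.15
```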

Meanwhile, Ng had convinced Google to let him use its data and computers on what became Google Brain. The project's ability to spot cats was a compelling (but not, on its own, commercially viable) demonstration of unsupervised learning — the most difficult learning task, because the input comes without any explanatory information such as names, titles or categories. But Ng soon became troubled that few researchers outside Google had the tools to work on deep learning. “After many of my talks,” he says, “depressed graduate students would come up to me and say: 'I don't have 1,000 computers lying around, can I even research this?'”

So back at Stanford, Ng started developing bigger, cheaper deep-learning networks using graphics processing units (GPUs) — the super-fast chips developed for home-computer gaming (ref. 3). Others were doing the same. “For about US$100,000 in hardware, we can build an 11-billion-connection network, with 64 GPUs,” says Ng.

Victorious machine

But winning over computer-vision scientists would take more: they wanted to see gains on standardized tests. Malik remembers that Hinton asked him: “You're a sceptic. What would convince you?” Malik replied that a victory in the internationally renowned ImageNet competition might do the trick.

In that competition, teams train computer programs on a data set of about 1 million images that have each been manually labelled with a category. After training, the programs are tested by getting them to suggest labels for similar images that they have never seen before. They are given five guesses for each test image; if the right answer is not one of those five, the test counts as an error. Past winners had typically erred about 25% of the time. In 2012, Hinton's lab entered the first ever competitor to use deep learning. It had an error rate of just 15% (ref. 4).
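
Concretely, that scoring rule (five ranked guesses per image, an error only when none of them is the labelled category) can be sketched as follows; the labels and guesses here are invented.

```python
# Sketch: the "five guesses" error rate used to score ImageNet entries.

def top5_error(true_labels, guesses):
    wrong = sum(1 for label, top5 in zip(true_labels, guesses) if label not in top5)
    return wrong / len(true_labels)

true_labels = ['cat', 'dog', 'car']
guesses = [
    ['tabby', 'cat', 'lynx', 'fox', 'dog'],        # counts as correct: 'cat' appears
    ['wolf', 'fox', 'coyote', 'jackal', 'dingo'],  # counts as an error: no 'dog'
    ['car', 'truck', 'bus', 'van', 'tram'],        # counts as correct
]
print(top5_error(true_labels, guesses))   # -> 0.333... (one error out of three images)
```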

“Deep learning stomped on everything else,” says LeCun, who was not part of that team. The win landed Hinton a part-time job at Google, and the company used the program to update its Google+ photo-search software in May 2013.

Malik was won over. “In science you have to be swayed by empirical evidence, and this was clear evidence,” he says. Since then, he has adapted the technique to beat the record in another visual-recognition competition (ref. 5). Many others have followed: in 2013, all entrants to the ImageNet competition used deep learning.


With triumphs in hand for image and speech recognition, there is now increasing interest in applying deep learning to natural-language understanding — comprehending human discourse well enough to rephrase or answer questions, for example — and to translation from one language to another. Again, these are currently done using hand-coded rules and statistical analysis of known text. The state-of-the-art of such techniques can be seen in software such as Google Translate, which can produce results that are comprehensible (if sometimes comical) but nowhere near as good as a smooth human translation. “Deep learning will have a chance to do something much better than the current practice here,” says crowd-sourcing expert Luis von Ahn, whose company Duolingo, based in Pittsburgh, Pennsylvania, relies on humans, not computers, to translate text. “The one thing everyone agrees on is that it's time to try something different.”

Deep science

In the meantime, deep learning has been proving useful for a variety of scientific tasks. “Deep nets are really good at finding patterns in data sets,” says Hinton. In 2012, the pharmaceutical company Merck offered a prize to whoever could beat its best programs for helping to predict useful drug candidates. The task was to trawl through database entries on more than 30,000 small molecules, each of which had thousands of numerical chemical-property descriptors, and to try to predict how each one acted on 15 different target molecules. Dahl and his colleagues won $22,000 with a deep-learning system. “We improved on Merck's baseline by about 15%,” he says.

Biologists and computational researchers including Sebastian Seung of the Massachusetts Institute of Technology in Cambridge are using deep learning to help them to analyse three-dimensional images of brain slices. Such images contain a tangle of lines that represent the connections between neurons; these need to be identified so they can be mapped and counted. In the past, undergraduates have been enlisted to trace out the lines, but automating the process is the only way to deal with the billions of connections that are expected to turn up as such projects continue. Deep learning seems to be the best way to automate. Seung is currently using a deep-learning program to map neurons in a large chunk of the retina, then forwarding the results to be proofread by volunteers in a crowd-sourced online game called EyeWire.

William Stafford Noble, a computer scientist at the University of Washington in Seattle, has used deep learning to teach a program to look at a string of amino acids and predict the structure of the resulting protein — whether various portions will form a helix or a loop, for example, or how easy it will be for a solvent to sneak into gaps in the structure. Noble has so far trained his program on one small data set, and over the coming months he will move on to the Protein Data Bank: a global repository that currently contains nearly 100,000 structures.
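The per-residue task Noble describes is often framed as classifying each amino acid from a short window of its neighbours. The sketch below shows only that framing: the ten-residue sequence, the helix/loop labels and the off-the-shelf logistic-regression classifier are all invented stand-ins, not Noble's actual data or model.

```python
# Illustrative only: classify each residue as helix (H) or loop/coil (C) from
# a window of neighbouring amino acids, one-hot encoded. The tiny training set
# and the choice of logistic regression are assumptions for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def window_features(sequence, window=5):
    """One-hot encode a sliding window centred on each residue.
    'X' padding at the ends encodes as all zeros."""
    half = window // 2
    padded = "X" * half + sequence + "X" * half
    rows = []
    for i in range(len(sequence)):
        feats = []
        for aa in padded[i:i + window]:
            feats.extend(1.0 if aa == a else 0.0 for a in AMINO_ACIDS)
        rows.append(feats)
    return np.array(rows)

# Made-up sequence and per-residue labels (H = helix, C = loop/coil).
seq    = "MKTAYIAKQR"
labels = list("CCHHHHHHCC")

model = LogisticRegression(max_iter=1000).fit(window_features(seq), labels)
print(model.predict(window_features("MKTAY")))
```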

For computer scientists, deep learning could earn big profits: Dahl is thinking about start-up opportunities, and LeCun was hired last month to head a new AI department at Facebook. The technique holds the promise of practical success for AI. “Deep learning happens to have the property that if you feed it more data it gets better and better,” notes Ng. “Deep-learning algorithms aren't the only ones like that, but they're arguably the best — certainly the easiest. That's why it has huge promise for the future.”

Not all researchers are so committed to the idea. Oren Etzioni, director of the Allen Institute for Artificial Intelligence in Seattle, which launched last September with the aim of developing AI, says he will not be using the brain for inspiration. “It's like when we invented flight,” he says; the most successful designs for aeroplanes were not modelled on bird biology. Etzioni's specific goal is to invent a computer that, when given a stack of scanned textbooks, can pass standardized elementary-school science tests (ramping up eventually to pre-university exams). To pass the tests, a computer must be able to read and understand diagrams and text. How the Allen Institute will make that happen is undecided as yet — but for Etzioni, neural networks and deep learning are not at the top of the list.

One competing idea is to rely on a computer that can reason on the basis of inputted facts, rather than trying to learn its own facts from scratch. So it might be programmed with assertions such as 'all girls are people'. Then, when it is presented with a text that mentions a girl, the computer could deduce that the girl in question is a person. Thousands, if not millions, of such facts are required to cover even ordinary, common-sense knowledge about the world. But it is roughly what went into IBM's Watson computer, which famously won a match of the television game show Jeopardy against top human competitors in 2011. Even so, IBM's Watson Solutions has an experimental interest in deep learning for improving pattern recognition, says Rob High, chief technology officer for the company, which is based in Austin, Texas.
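As a toy illustration of that rule-based style (and emphatically not IBM Watson's machinery), a program can chain hand-coded "every X is a Y" assertions to reach the deduction described above. Apart from the "all girls are people" example taken from the text, the rules and names below are invented.

```python
# Toy illustration of reasoning over hand-coded assertions: rules such as
# "all girls are people" let the program deduce category membership for
# entities mentioned in a text. The extra 'person -> mammal' rule is invented
# here purely to show that the deduction can be chained.

RULES = {"girl": "person", "person": "mammal"}   # 'every X is a Y'

def deduce_categories(entity_type, rules=RULES):
    """Follow 'is-a' rules transitively from a starting category."""
    categories = [entity_type]
    while categories[-1] in rules:
        categories.append(rules[categories[-1]])
    return categories

# If a text mentions a girl, the program can infer she is a person (and a mammal).
print(deduce_categories("girl"))   # ['girl', 'person', 'mammal']
```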

Google, too, is hedging its bets. Although its latest advances in picture tagging are based on Hinton's deep-learning networks, it has other departments with a wider remit. In December 2012, it hired futurist Ray Kurzweil to pursue various ways for computers to learn from experience — using techniques including but not limited to deep learning. Last May, Google acquired a quantum computer made by D-Wave in Burnaby, Canada (see Nature 498, 286–288; 2013). This computer holds promise for non-AI tasks such as difficult mathematical computations — although it could, theoretically, be applied to deep learning.

Despite its successes, deep learning is still in its infancy. “It's part of the future,” says Dahl. “In a way it's amazing we've done so much with so little.” And, he adds, “we've barely begun”.

References

Mohamed, A. et al. 2011 IEEE Int. Conf. Acoustics Speech Signal Process. http://dx.doi.org/10.1109/ICASSP.2011.5947494 (2011).

Coates, A. et al. J. Machine Learn. Res. Workshop Conf. Proc. 28, 1337–1345 (2013).

Krizhevsky, A., Sutskever, I. & Hinton, G. E. In Advances in Neural Information Processing Systems 25; available at http://go.nature.com/ibace6

Girshick, R., Donahue, J., Darrell, T. & Malik, J. Preprint at http://arxiv.org/abs/1311.2524 (2013).

Nicola Jones is a freelance reporter based near Vancouver, Canada.

Computer Virus

a current news article about computer technology

Computer Virus News

Experts issue Wi-Fi security warning

Computer experts say they have identified a serious vulnerability in any technology that connects to a Wi-Fi network.

Fri, 27 Oct 2017

An independent investigation has concluded that the debilitating cyberattack that crippled parts of the National Health Service earlier this year could have been prevented with basic security measures.

Wed, 25 Oct 2017

Companies in Ukraine, Russia come under cyberattack from new strain of malicious software

Thu, 19 Oct 2017

Ways that seniors can protect themselves from financial fraud at the hands of scammers and, yes, even people they know

Wed, 11 Oct 2017

North Korea's hacking capabilities are "beyond imagination," one former computer expert for the North told ABC News in the wake of Tuesday's report that the nation had stolen secret intelligence documents, including the U.S.-South Korean war strategy.

Wed, 11 Oct 2017

Hackers in North Korea have allegedly stolen a cache of classified military documents from South Korea, according to a South Korean lawmaker. Lee Cheol-hee, a member of South Korea's ruling Democratic Party, initially told local media outlets that the documents were taken in a September 2016 hack

Sat, 30 Sep 2017

White House officials in the early days of the Trump administration were given security briefings on the dangers of using personal devices, like laptops and cell phones, and the risks of using personal email accounts for government work, a source familiar with the program tells ABC News.

Wed, 13 Sep 2017

Apple fans are getting ready to preorder the new iPhone 8, 8 Plus and iPhone X.

The North American and European energy sectors are being targeted by a "new wave" of cyberattacks by the group known as Dragonfly, according to a research report released Wednesday by cybersecurity firm Symantec. These attacks are specifically focused on the power grid.

ME 370: Current Issues in Technology and Society

a current news article about computer technology

ME 370: Current Issues in Technology and Society

This page provides a list of topics involving the social, environmental, economic and political impact of technology.

This page is not exhaustive and it is not academically rigorous. I have added content as issues appear in the national, regional and local news. The information on these pages is intended to stimulate your thinking, not constrain it.

The topics are listed without any implication about their order of significance or chronology.

Energy Consumption

The US Energy Information Administration's Energy in Brief is a series of articles on different aspects of energy generation and consumption.

Food and Water are as crucial as energy

Consider this interview with Lester Brown on Fresh Air. Be sure to listen to the end when Mr. Brown is asked whether he is a pessimist or an optimist. He answers by describing the US response to the attack on Pearl Harbor.

Seventy percent of the surface of the earth is covered by water, mostly in the oceans. Those oceans are not that deep compared to the radius of the earth. In May 2012, the USGS published an image that depicted the amount of water on the earth gathered into a single sphere. Jay Kimball modified the USGS image to show a sphere proportional to the amount of fresh water next to the sphere of all water on the earth. Both images are stunning in showing that the answer to the question, "How much water is on earth?" is "not that much".

Drones: Local and Global Issues

Pilotless drones, also known as Unmanned Aerial Vehicles (UAVs), are being used regularly by the US military to spy on and kill declared enemies. Wikipedia provides an overview of the technology.

Military UAVs exist in many configurations. A large fraction of military drones are used for surveillance. Larger UAVs, such as the MQ-9 Reaper, are equipped with an assortment of missiles.

Armed UAVs provide distinct military advantages. These systems can hover over targets for long periods while pilots on the ground take breaks or change shifts. The systems allow attacks over great distances and in difficult terrain without risking the life of the pilot. And currently, the US has a decided advantage in using UAVs as weapons.

In his 2009 book Wired for War, Peter Singer described an array of robotic weapons, including UAVs. For a quick overview of the issues, refer to Singer's testimony to the US Congress in March 2010. The political implications of military UAVs are starting to register in the mainstream political debate. For example, an October article in the Washington Post describes the "kill list" used in choosing the targets of UAV weapons. The New America Foundation is tracking the deaths by drone attacks.

  • Non-combatants killed in drone strikes: innocent civilians, Legal recourse of Pakistani civilians
  • Issues of national sovereignty
  • Ethics of using drones to assassinate enemies who are not on a conventional battlefield
  • Implications of treating warfare like a video game
  • Low-cost DIY drones mean that attacks against and within the US are likely: copycats
  • Loss of privacy: neighbors can buy drones with video
  • FAA allowing non-commercial overflight: previously unregulated

The good comes with the bad:

  • Monitoring and catching poachers in Africa
  • Search and rescue
  • Close up "remote" sensing for science

Ethical issues in artificial intelligence and machine reasoning

Gary Marcus, in a 2012 essay in the New Yorker, points out the coming ethical dilemma posed by machines that are more competent than we are:

Within two or three decades the difference between automated driving and human driving will be so great you may not be legally allowed to drive your own car, and even if you are allowed, it would be immoral of you to drive, because the risk of you hurting yourself or another person will be far greater than if you allowed a machine to do the work.

Marcus draws the parallel with the need for humans to continue to evolve our ethical understanding. He goes on to argue that we can't wait to figure out how to give machines the ability to reason about ethics.

Building machines with a conscience is a big job, and one that will require the coordinated efforts of philosophers, computer scientists, legislators, and lawyers. And, as Colin Allen, a pioneer in machine ethics, put it, "We don't want to get to the point where we should have had this discussion twenty years ago." As machines become faster, more intelligent, and more powerful, the need to endow them with a sense of morality becomes more and more urgent.

Nicholas Carr initiated an interesting discussion with his blog post titled Moral Code (27 Nov 2012). One of the central questions debated is how we will delegate moral/ethical decision-making to machines. The "how" question is not about the mechanics of implementing a decision in software; it is more about who decides and which decisions are encoded in our machines. This is already done: algorithms (sometimes embodied in mechanical devices) decide when to deploy air-bags, or when to release steam from a boiler once the pressure exceeds a threshold. Humans have made the decisions that the computer (or mechanism) executes. The debate in the comment section (do read the comments!) is mostly about whether a future with more complex robots represents a qualitatively different situation. Carr thinks it does; several of the commentators think it does not. A variation on this argument asks to what degree our safety algorithms are the result of human moral decision-making, and to what degree they are shaped by insurance liability and legal battles over who has to pay for bad outcomes. I'll need to re-read the post and the thread. Please visit that discussion yourself.
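For readers unfamiliar with what "encoding a human decision" looks like in practice, the fragment below shows the boiler example in its simplest possible form. The 8.5 bar limit is an arbitrary illustrative number chosen here, not an engineering standard.

```python
# A deliberately simple example of a human judgement encoded as an algorithm:
# a (hypothetical) human designer chose the pressure limit; the controller
# merely executes that choice, over and over, without deliberation.

RELIEF_THRESHOLD_BAR = 8.5   # the human-chosen limit (illustrative value)

def should_release_steam(pressure_bar):
    return pressure_bar > RELIEF_THRESHOLD_BAR

for reading in [7.9, 8.2, 9.1]:
    print(reading, "bar ->", "release" if should_release_steam(reading) else "hold")
```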

As another jumping-off point for further exploration, consider the Singularity Institute.

Synthetic Body Parts: Prosthetics to Electronics

Prosthetics are common for people with missing limbs. As technology advances, the use of active prosthetics is becoming more feasible. In sports, the ethical implications of high-performance prosthetics are being debated. Oscar Pistorius, who runs on carbon fiber "blades", qualified for the 400m semifinals in the 2012 Olympic Games in London. He did not make it to the finals, but he is in the class of elite athletes competing in track events.

Prosthetics are becoming fashionable as well as functional. Aimee Mullins, who was born without fibular bones between her knees and feet, is a model and athlete. She gives an inspiring talk at TED. People who have lost parts of their limbs are even choosing optional amputation.

The use of prosthetics, or any medical technology, to counteract the consequence of disease or accident is broadly acceptable. Use of technology to help someone "return to normal" does not raise too many ethical questions. However, what about going beyond "normal" and using technology for human augmentation? This article at bloomberg.com gives a quick visual overview of technology at the boundary between restoration of function and human augmentation. A podcast by science fiction writer Robert J. Sawyer explores the future. (Direct YouTube link.)

Implantable devices -- heart pacemakers, drug delivery devices, and brain stimulators -- are another area where man/machine interfaces are common and growing in sophistication. Those devices can use Wi-Fi to communicate with clinical equipment outside of the human body. This introduces a vulnerability to a wi-fi attack. According to a recent report, researchers at MIT have developed a device that can safely jam the communication channel between the device and the outside world, thereby preventing unauthorized communication and even an attack on the person wearing the implant.

Electronic implants and augmentation are explored in some recent science fiction writing by Daniel Wilson in his book Amped and by John Scalzi in his book Old Man's War and others. As the novelist and former robotics researcher Daniel Wilson explains in an article in the Wall Street Journal and in an interview in Wired, humans are experimenting with electronic and mechanical augmentation of their bodies.

Neural-electronic implants require a Brain Computer Interface (BCI), which, according to Wikipedia, began with work in the 1970s. Early articles on Intelligence Amplification are listed on this Wikipedia page.

Here are some technical articles from a very superficial web search.

Japan's nuclear reactor failure

Update, 5 July 2012: A report by a panel of experts convened by the Japanese parliament has concluded that "The Fukushima nuclear disaster was the result of 'manmade' failures before and after last year's earthquake." An executive summary of the report is available from the [National Diet of Japan Fukushima Nuclear Accident Independent Investigation Commission](http://naiic.go.jp/en). Note: the "National Diet" of Japan is the bicameral legislature; see, e.g., Wikipedia.

The report is described by several news sources. According to a Reuters article:

"The . Fukushima Nuclear Power Plant accident was the result of collusion between the government, the regulators and Tepco, and the lack of governance by said parties," the panel said in an English summary of a 641-page Japanese document.

Regulators, it said, had been reluctant to adopt global safety standards that could have helped prevent the disaster in which reactors melted down, spewing radiation and forcing about 150,000 people from their homes, many of whom will never return.

"Across the board, the Commission found ignorance and arrogance unforgivable for anyone or any organisation that deals with nuclear power. We found a disregard for global trends and a disregard for public safety," the panel said.

It's interesting that the earthquake and subsequent tsunami are not being blamed as the cause of the problems. According to the Businessweek article:

The findings couldn't rule out the possibility that the magnitude-9 earthquake damaged the Fukushima Dai-Ichi No. 1 reactor and safety equipment. This is a departure from other reports that concluded the reactors withstood the earthquake, only to be disabled when the ensuing tsunami slammed into the plant.

This finding may have implications for all Japan's atomic plant operators if it leads to tougher earthquake-resistance standards. The operators reported combined losses of 1.6 trillion yen ($20 billion) in the year ended March owing to safety shutdowns of the country's 50 reactors and higher fuel bills when they started up gas and oil-fired plants.

If the Fukushima reactor had already been crippled by the quake when the tsunami hit, it would force regulators to reconsider the seismic criteria that all Japan's plants need to follow, their so-called design basis, said Najmedin Meshkati, a professor of civil engineering at the University of Southern California who has researched nuclear safety in Japan.

It appears that the report leads to blaming people, not the reactor design (which, of course, was created by people). This allows the argument that the reactors are not inherently unsafe, and that a repeat of this problem can be avoided with better oversight. It is not clear what kind of physical/engineering changes will result from better oversight.

Chernobyl Disaster

Could a low-cost Geiger counter that works with the open source Arduino microcontroller provide a way for people in Japan to monitor their exposure?
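If such a device were built, the main software task would be converting the tube's raw counts into an approximate dose rate. The sketch below shows that conversion under an assumed, uncalibrated, tube-specific conversion factor, so the numbers are illustrative only.

```python
# Hedged sketch: converting Geiger-counter counts per minute (CPM) into an
# approximate dose rate. The conversion factor varies by Geiger-Mueller tube;
# the value below is an assumed example, not a calibration.

CPM_PER_MICROSIEVERT_PER_HOUR = 150.0   # assumed tube-specific factor

def dose_rate_usv_per_hour(counts, minutes):
    cpm = counts / minutes
    return cpm / CPM_PER_MICROSIEVERT_PER_HOUR

# e.g. 450 counts recorded over 10 minutes
print(round(dose_rate_usv_per_hour(450, 10), 3), "uSv/h")  # 0.3 uSv/h
```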

Citizen Science

Safecast uses volunteers and other sources of data to share data on nuclear radiation levels in the environment. This is currently of great interest in the north and west of Japan.

Politicization of Science

The existence of global warming is an issue that is settled in the scientific community, yet it continues to be debated by governments and interest groups. A key tool of those who dispute the scientific evidence is to point to the uncertainties that climate scientists cite in their own work. A recent article (April 2012) in the Insurance Journal summarizes recent scientific studies that put the degree of doubt in perspective.

Wind Energy

Too much wind for the grid to handle: according to the Seattle Times, the Bonneville Power Administration (BPA) recently said that power from wind farms may not be used during times of high spring run-off from snowmelt. Here is a list of BPA articles related to the transmission of energy generated by wind power. Among the documents is a glossy fact sheet from October 2010.

Another issue with wind farms is that the turbines kill wild birds.

Fracking

Petroleum in shale deposits can be extracted by a process called fracking, which involves pumping water and chemicals at high pressure deep into the ground to break up the rock so that the oil and gas can flow. This process has also been blamed for groundwater pollution. In Pennsylvania, residents claim that their drinking water is so contaminated with chemicals that water running from a tap can be set on fire.

For a more general background on natural gas usage and supplies, consult the US Energy Information Administration (EIA). The EIA has a good introduction to shale gas.

Richard Pierce, Jr., Professor of Law at the George Washington University, has a brief post giving an overview of the legal and political issues around fracking.

An article in the Wall Street Journal (behind the pay wall) suggested that the EPA will be increasing regulation on fracking. The Houston Chronicle has a similar report.

In a blog post Jeff McMahon identifies a new modified silica called Osorb that removes more than 90 percent of the hazardous chemicals that are dumped into the "produced water". The blog post also describes the fracking process.

Update, 25 August 2011: In May 2011, Secretary of Energy Steven Chu appointed a panel of seven to make recommendations for regulations and drilling practices. The report was released on August 11, 2011. An article on the ProPublica website provides an overview from an environmental-law perspective. The vice president of public and government affairs for Exxon Mobil wrote a blog post to complain about the lack of industry input, while a group of environmental scientists wrote a letter to complain that six out of seven members of the panel had financial ties to the oil industry.

Update, 12 November 2011: There is concern that the dramatic increase in earthquakes in Oklahoma may be linked to fracking. This is very much an open question and there is only correlation, not evidence of causation.

All Electric Vehicles

All-electric vehicles do not pay gas taxes, yet they put wear and tear on the highways. How will the state of Oregon finance road repair and construction in the future when an increasing number of the vehicles that use the roads are not providing revenue via the gas tax?

One suggestion is to use Global Positioning Systems (GPS) to monitor the miles driven by electric vehicles. That is an interesting technological solution, but it introduces the potential for loss of privacy if the details of the GPS records are not kept confidential. How will drivers know that their movements recorded by the DMV are not being passed on to the state police, or worse yet, individuals inside the government with a partisan agenda?
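Setting the privacy question aside for a moment, the mileage calculation itself is straightforward: distances between successive GPS fixes can be summed with the haversine formula, so that in principle only a mileage total would ever need to leave the vehicle. The coordinates below are arbitrary example points, not a real route or any agency's actual method.

```python
# Minimal sketch: summing distance between successive GPS fixes with the
# haversine formula, so only the total (not the route) would need reporting.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3959.0

def haversine_miles(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def total_miles(fixes):
    return sum(haversine_miles(*a, *b) for a, b in zip(fixes, fixes[1:]))

route = [(45.512, -122.658), (45.520, -122.650), (45.531, -122.641)]  # example points
print(round(total_miles(route), 2), "miles driven")
```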

The lack of charging stations is another obstacle to the transition to electric vehicles. On May 20, 2011, the Oregonian reported on the chicken-and-egg problem with the deployment of charging stations. An important contributor to the installation of charging stations is a company called Ecotality:

Ecotality, the San Francisco company awarded $130 million by the U.S. Department of Energy to build the network of public charging stations, was supposed to have 1,100 installed in Oregon by the end of next month [June 2011]. But as of last week it has yet to install a single public station in Oregon.

Without charging stations, consumers will be reluctant to buy all-electric vehicles. Without a sufficient consumer demand for electric vehicles, building out a network of recharging stations will be a substantial risk. Is there an engineering solution to this technology problem?

Update: On June 8, 2011, charging stations in Northeast Portland and Wilsonville were open to the public. Currently these stations operate free of charge. The stations take four hours to charge a car. According to the article in the Oregonian there are currently 30 owners of all-electric vehicles in the city.

Future of Boardman Power Plant

In Fall 2010, PGE wanted to use the Boardman Coal Plant for the remaining 30 years of its design life. Coal is a source of CO2 and other pollutants. PGE is considering switching the Boardman plant to biofuel grown near the plant. The biofuel requires irrigation, which competes with food crops that are also grown in the region.

Using Algorithms to Improve Health Care

As described in this article, predictive algorithms are just starting to be used in health care. The idea is to use semi-empirical models with real-time monitoring of data from sensors to predict when a patient might be experiencing a medical problem.

Predictive algorithms are well established in many other areas, but are only beginning to appear in medicine. It would seem that there are many benefits to be gained, and some ethical issues to be resolved.
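In its simplest imaginable form, such a predictor might just smooth a vital sign and flag the patient when it drifts outside an expected band; anything clinically useful is far more sophisticated, so treat the heart-rate readings and the 100 bpm threshold below as invented illustrations of the idea, not a medical algorithm.

```python
# Toy illustration of sensor-based early warning (not a clinical algorithm):
# flag a patient when the recent average heart rate drifts above a threshold.
# The readings and the 100 bpm limit are invented example values.

def rolling_mean(values, window=3):
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

heart_rate = [72, 75, 74, 88, 95, 104, 110]   # beats per minute over time
ALERT_BPM = 100

for i, avg in enumerate(rolling_mean(heart_rate)):
    if avg > ALERT_BPM:
        print(f"alert at sample {i + 2}: smoothed heart rate {avg:.1f} bpm")
```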

On a simpler technological level, Atul Gawande argues in his book The Checklist Manifesto that following checklists improves outcomes for complex tasks. A checklist is a simple algorithm that does not require a computer for its implementation.

Synthetic Meat

On the NPR Radio Show, Fresh Air, Terry Gross interviews Michael Specter, who published an article in the New Yorker about "In-vitro Meat", i.e. protein tissue with the genetic make-up of animal flesh that is grown without an animal. This activity raises technical and ethical issues. Some see synthetic meat as a way to provide food without killing animals, and with potentially more efficient use of resources. That would reduce the ethical dilemma of raising millions of animals just to slaughter them for food. Against that benefit is the concern that this technology will give pharmaceutical companies even more control over our lives.

FastCompany has a short video on the technology being developed by Beyond Meat to use soy beans to create a synthetic "meat" that has the same texture and "mouth feel" as chicken.

Loss of Privacy Due to Ubiquitous Digital Records

The book Blown to Bits describes how pervasive collection, storage, and analysis of personally identifying information has forever changed our privacy and autonomy. This is a huge area of concern. Several sub-areas could be suitable for projects.

The book can be purchased in a conventional form, and it may be downloaded as a single PDF from www.bitsbook.com.

According to this brief article on the ASEE eGFI (engineering, Go For It!) web site, police officers in Brazil will be getting sunglasses with built-in facial recognition technology. The glasses analyze faces and match the biometric signature against a database of known criminals or terrorism suspects. The glasses then put a red dot on the image of the suspect's face. Other sources of information on this story are here and here.

Large Scale Solar Energy Plants

Todd Woody reports in Forbes Magazine about a large-scale solar power plant being developed in Nevada. The plant is designed to produce 480,000 MWh of energy per year, which is equivalent to 55 MW of continuous output. The article calls the installation a 110-megawatt solar thermal power plant. Millions of gallons of molten salt will be used to store thermal energy captured during the daytime so that the plant can produce electricity around the clock.
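The relationship between those figures is simple arithmetic: 480,000 MWh spread over the 8,760 hours in a year is about 55 MW of average output, roughly half the 110 MW nameplate rating. The quick check below makes that explicit; the resulting ~50% capacity factor is a derived number, not one stated in the article.

```python
# Checking the article's numbers: average output = annual energy / hours in a year,
# and capacity factor = average output / nameplate rating.

annual_energy_mwh = 480_000
hours_per_year = 8_760
nameplate_mw = 110

average_mw = annual_energy_mwh / hours_per_year
capacity_factor = average_mw / nameplate_mw
print(f"{average_mw:.1f} MW average, capacity factor {capacity_factor:.0%}")
# -> 54.8 MW average, capacity factor 50%
```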

The plant design calls for an absorber at the top of a 640-foot-tall tower that is surrounded by a field of 17,500 heliostat mirrors. Each mirror moves independently to track the sun and reflect sunlight to the absorber at the top of the tower. The absorber provides the heat source for a steam turbine that produces electricity.

Construction of the plant is made possible by a $737 million loan guarantee from the federal government. Approximately 500 short-term construction jobs will be created during construction, and approximately 50 full-time employees will be needed to run the plant.

Large, centralized solar power plants have advantages of efficiency and cost. However, they are still large industrial facilities occupying large land area and requiring substantial material to build and maintain. Locating a centralized power plant in the desert may seem like a no-brainer, but there are environmental consequences to disrupting the fragile desert ecosystem. An article in the Washington Post describes the threat of one such project to desert tortoises.

Environmental Regulations and the Economy

The purpose of environmental regulation is to control activities that harm the environment. In some cases, these activities have financial benefit to individuals or groups. The balance between economic benefits and environmental protection is often the subject of political debate and power struggles.

A 2011 article in the Washington Post cites data from the Bureau of Labor Statistics that during 2010, 0.3 percent of worker layoffs were attributed to government regulations or interventions, whereas 25 percent of layoffs were caused by a loss in business demand. These estimates were made by executives of the companies that laid off the workers.

The balance between the environment and the economy is complex and easily politicized. The one data set cited here is just one narrow slice of this issue.

Modern technology is changing the way our brains work, says neuroscientist, Daily Mail Online

a current news article about computer technology

Modern technology is changing the way our brains work, says neuroscientist

By SUSAN GREENFIELD

Human identity, the idea that defines each and every one of us, could be facing an unprecedented crisis.

It is a crisis that would threaten long-held notions of who we are, what we do and how we behave.

It goes right to the heart - or the head - of us all. This crisis could reshape how we interact with each other, alter what makes us happy, and modify our capacity for reaching our full potential as individuals.

And it's caused by one simple fact: the human brain, that most sensitive of organs, is under threat from the modern world.

Unless we wake up to the damage that the gadget-filled, pharmaceutically-enhanced 21st century is doing to our brains, we could be sleepwalking towards a future in which neuro-chip technology blurs the line between living and non-living machines, and between our bodies and the outside world.

It would be a world where such devices could enhance our muscle power, or our senses, beyond the norm, and where we all take a daily cocktail of drugs to control our moods and performance.

Already, an electronic chip is being developed that could allow a paralysed patient to move a robotic limb just by thinking about it. As for drug manipulated moods, they're already with us - although so far only to a medically prescribed extent.

Increasing numbers of people already take Prozac for depression, Paxil as an antidote for shyness, and give Ritalin to children to improve their concentration. But what if there were still more pills to enhance or "correct" a range of other specific mental functions?

What would such aspirations to be "perfect" or "better" do to our notions of identity, and what would it do to those who could not get their hands on the pills? Would some finally have become more equal than others, as George Orwell always feared?

Of course, there are benefits from technical progress - but there are great dangers as well, and I believe that we are seeing some of those today.

I'm a neuroscientist and my day-to-day research at Oxford University strives for an ever greater understanding - and therefore maybe, one day, a cure - for Alzheimer's disease.

But one vital fact I have learnt is that the brain is not the unchanging organ that we might imagine. It not only goes on developing, changing and, in some tragic cases, eventually deteriorating with age, it is also substantially shaped by what we do to it and by the experience of daily life. When I say "shaped", I'm not talking figuratively or metaphorically; I'm talking literally. At a microcellular level, the infinitely complex network of nerve cells that make up the constituent parts of the brain actually change in response to certain experiences and stimuli.

The brain, in other words, is malleable - not just in early childhood but right up to early adulthood, and, in certain instances, beyond. The surrounding environment has a huge impact both on the way our brains develop and how that brain is transformed into a unique human mind.

Of course, there's nothing new about that: human brains have been changing, adapting and developing in response to outside stimuli for centuries.

What prompted me to write my book is that the pace of change in the outside environment and in the development of new technologies has increased dramatically. This will affect our brains over the next 100 years in ways we might never have imagined.

Our brains are under the influence of an ever-expanding world of new technology: multichannel television, video games, MP3 players, the internet, wireless networks, Bluetooth links - the list goes on and on.

But our modern brains are also having to adapt to other 21st century intrusions, some of which, such as prescribed drugs like Ritalin and Prozac, are supposed to be of benefit, and some of which, such as widely available illegal drugs like cannabis and heroin, are not.

Electronic devices and pharmaceutical drugs all have an impact on the micro-cellular structure and complex biochemistry of our brains. And that, in turn, affects our personality, our behaviour and our characteristics. In short, the modern world could well be altering our human identity.

Three hundred years ago, our notions of human identity were vastly simpler: we were defined by the family we were born into and our position within that family. Social advancement was nigh on impossible and the concept of "individuality" took a back seat.

That only arrived with the Industrial Revolution, which for the first time offered rewards for initiative, ingenuity and ambition. Suddenly, people had their own life stories - ones which could be shaped by their own thoughts and actions. For the first time, individuals had a real sense of self.

But with our brains now under such widespread attack from the modern world, there's a danger that that cherished sense of self could be diminished or even lost.

Anyone who doubts the malleability of the adult brain should consider a startling piece of research conducted at Harvard Medical School. There, a group of adult volunteers, none of whom could previously play the piano, were split into three groups.

The first group were taken into a room with a piano and given intensive piano practice for five days. The second group were taken into an identical room with an identical piano - but had nothing to do with the instrument at all.

And the third group were taken into an identical room with an identical piano and were then told that for the next five days they had to just imagine they were practising piano exercises.

The resultant brain scans were extraordinary. Not surprisingly, the brains of those who simply sat in the same room as the piano hadn't changed at all.

Equally unsurprising was the fact that those who had performed the piano exercises saw marked structural changes in the area of the brain associated with finger movement.

But what was truly astonishing was that the group who had merely imagined doing the piano exercises saw changes in brain structure that were almost as pronounced as those that had actually had lessons. "The power of imagination" is not a metaphor, it seems; it's real, and has a physical basis in your brain.

Alas, no neuroscientist can explain how the sort of changes that the Harvard experimenters reported at the micro-cellular level translate into changes in character, personality or behaviour. But we don't need to know that to realise that changes in brain structure and our higher thoughts and feelings are incontrovertibly linked.

What worries me is that if something as innocuous as imagining a piano lesson can bring about a visible physical change in brain structure, and therefore some presumably minor change in the way the aspiring player performs, what changes might long stints playing violent computer games bring about? That eternal teenage protest of 'it's only a game, Mum' certainly begins to ring alarmingly hollow.

Already, it's pretty clear that the screen-based, two-dimensional world that so many teenagers - and a growing number of adults - choose to inhabit is producing changes in behaviour. Attention spans are shorter, personal communication skills are reduced and there's a marked reduction in the ability to think abstractly.

This games-driven generation interpret the world through screen-shaped eyes. It's almost as if something hasn't really happened until it's been posted on Facebook, Bebo or YouTube.

Add that to the huge amount of personal information now stored on the internet - births, marriages, telephone numbers, credit ratings, holiday pictures - and it's sometimes difficult to know where the boundaries of our individuality actually lie. Only one thing is certain: those boundaries are weakening.

And they could weaken further still if, and when, neurochip technology becomes more widely available. These tiny devices will take advantage of the discovery that nerve cells and silicon chips can happily co-exist, allowing an interface between the electronic world and the human body. One of my colleagues recently suggested that someone could be fitted with a cochlear implant (a device that converts sound waves into electronic impulses and enables the deaf to hear) and a skull-mounted micro-chip that converts brain waves into words (a prototype is under research).

Then, if both devices were connected to a wireless network, we really would have arrived at the point which science fiction writers have been getting excited about for years. Mind reading!

He was joking, but for how long the gag remains funny is far from clear.

Today's technology is already producing a marked shift in the way we think and behave, particularly among the young.

I mustn't, however, be too censorious, because what I'm talking about is pleasure. For some, pleasure means wine, women and song; for others, more recently, sex, drugs and rock 'n' roll; and for millions today, endless hours at the computer console.

But whatever your particular variety of pleasure (and energetic sport needs to be added to the list), it's long been accepted that 'pure' pleasure - that is to say, activity during which you truly "let yourself go" - was part of the diverse portfolio of normal human life. Until now, that is.

Now, coinciding with the moment when technology and pharmaceutical companies are finding ever more ways to have a direct influence on the human brain, pleasure is becoming the sole be-all and end-all of many lives, especially among the young.

We could be raising a hedonistic generation who live only in the thrill of the computer-generated moment, and are in distinct danger of detaching themselves from what the rest of us would consider the real world.

This is a trend that worries me profoundly. For as any alcoholic or drug addict will tell you, nobody can be trapped in the moment of pleasure forever. Sooner or later, you have to come down.

I'm certainly not saying all video games are addictive (as yet, there is not enough research to back that up), and I genuinely welcome the new generation of "brain-training" computer games aimed at keeping the little grey cells active for longer.

As my Alzheimer's research has shown me, when it comes to higher brain function, it's clear that there is some truth in the adage "use it or lose it".

However, playing certain games can mimic addiction, and the heaviest users of these games might soon begin to do a pretty good impersonation of an addict.

Throw in circumstantial evidence that links a sharp rise in diagnoses of Attention Deficit Hyperactivity Disorder and the associated three-fold increase in Ritalin prescriptions over the past ten years with the boom in computer games and you have an immensely worrying scenario.

But we mustn't be too pessimistic about the future. It may sound frighteningly Orwellian, but there may be some potential advantages to be gained from our growing understanding of the human brain's tremendous plasticity. What if we could create an environment that would allow the brain to develop in a way that was seen to be of universal benefit?

I'm not convinced that scientists will ever find a way of manipulating the brain to make us all much cleverer (it would probably be cheaper and far more effective to manipulate the education system). And nor do I believe that we can somehow be made much happier - not, at least, without somehow anaesthetising ourselves against the sadness and misery that is part and parcel of the human condition.

When someone I love dies, I still want to be able to cry.

But I do, paradoxically, see potential in one particular direction. I think it possible that we might one day be able to harness outside stimuli in such a way that creativity - surely the ultimate expression of individuality - is actually boosted rather than diminished.

I am optimistic and excited by what future research will reveal into the workings of the human brain, and the extraordinary process by which it is translated into a uniquely individual mind.

But I'm also concerned that we seem to be so oblivious to the dangers that are already upon us.

Well, that debate must start now. Identity, the very essence of what it is to be human, is open to change - both good and bad. Our children, and certainly our grandchildren, will not thank us if we put off discussion much longer.

• Adapted from ID: The Quest For Identity In The 21st Century by Susan Greenfield, to be published by Sceptre on May 15 at £16.99. To order a copy for £15.30 (p&p free), call 0845 606 4206.

A Short Article on Technology

A Short Article on Technology

The world has undergone enormous changes over the past decade. We now live in a world where communication is paramount. It seems that everyone and everything is connected in some way.

For school students, this has made things much more efficient. Research papers that used to involve hours of laborious effort can now be researched and documented without ever touching a card catalog or a periodical index. Worlds of information are now available at the click of a mouse.

Questions that people pondered without any answer previously can now simply be typed into any convenient search engine and answered almost immediately. There are countless sites filled with informative short articles all over the Internet. Videos and music can now be seen on demand and news from across the world can be delivered in an instant.

There are some people who worry that the technological revolution and evolution we are experiencing today is moving too fast. There seems to be a loss of privacy in some respects, and the specter of a Big Brother society looms larger than it has since 1984. Whether their fears are well founded remains to be seen, but it is unlikely that people will ever willingly give up the almost instant connections to our wired world.

Flying in the face of these fears are individuals who share their worlds through their blogs. What used to be shared with only close friends is now put online for millions of people to see, should they happen upon the blogger's website. Individuals are learning to take advantage of this by using their well-placed blogs to sell products and services. The internet has allowed individuals an opportunity to step onto the same playing field as the big boys of business. With the right information and the ability to get it seen, anyone can now reach the masses and share their thoughts, feelings and even sales pitches.

Businesses as well as individuals have come to rely on the Internet as a source of advertising and actual sales. Entire business models have been constructed, and are thriving, based solely on Internet websites. It is rare today to find a traditional brick-and-mortar establishment that does not have some type of online presence. Any business that does not adapt and grow to keep up with the newest technology seriously risks being left behind in the wake of competitors who choose to ride technology's leading edge.

Time will tell where this all will lead. We should make the most of the positive possibilities technology promises, but we should also keep a careful watch on where we are going.