
Cyberethics

Ethics is a set of moral principles that guide decision-making among people in a society, a profession, or a business; it is the thought process that helps people determine what is right and wrong when they are forced to make a choice. In the digital age, computing has made certain kinds of moral decision-making more difficult than ever, prompting calls for a new “computer ethics” or “cyberethics,” and for the creation of a corresponding new field of academic study and research. Cyberethics is a complex discipline; because of its unique thinking man/thinking machine dynamic, its questions outdistance the moral problems and solutions outlined by ancient ethicists like Aristotle, and even those of more modern thinkers like David Hume and Immanuel Kant.

The topic of cyberethics is broad. Dartmouth College philosophy professor James H. Moor describes it as “the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology.” In other words, it means studying the right ways and the wrong ways for people to make choices when using computers. For Moor, the targets of examination include not just computers themselves but also the associated peripheral hardware and software, as well as the personal behavior of individuals and the behaviors of a computer-reliant society. He notes the truth behind the cliché that technology often outstrips the ability of ethics to keep pace, and describes a “policy vacuum” at the level of laws and social mores that could otherwise help determine how computing technologies should best be used. This social deficit, he says, is exacerbated by a “conceptual vacuum”: muddled thinking caused by the sheer complexity and sophistication of digital technology, a complexity that even the most brilliant human minds cannot fully grasp.

Deborah G. Johnson, professor of public policy at the Georgia Institute of Technology, gives a concrete example of such muddled thinking and its impact on computer ethics in her discussion of computer “hackers,” those who use their computing skills to gain unauthorized access to computer systems and networks. Early in computing history, the popular attitude was that hacking was almost a humorous pursuit, a kind of practical joke. But in reality, Johnson asks, how is hacking different from breaking the locks off an office door and rifling through someone's file cabinet?

Perhaps a better analogy would be setting fire to a mall after hours to demonstrate presumed flaws in its automatic sprinkler system. Would this also be acceptable? No one would be physically hurt in either case, although one might assume that the monetary and property damage would be greater in the case of the burned mall building. But that may be a false assumption, considering the costs of the massive denial-of-service attacks of early 2000, or of the destructive Internet “worms” like Code Red from the summer of 2001, which probably caused greater financial losses worldwide than the destruction of any brick-and-mortar store. Yet even today, Johnson notes, much of the technology press continues to treat hacking as more mischievous than criminal. Meanwhile, corporations hungry for computer security offer huge salaries to talented hackers who make their mark through high-profile public hacking incidents. Ethical standards seem indistinct.

...
