Online Privacy as a Corporate Social Responsibility: An Empirical Study
Business Ethics: A European Review, Volume 20, Number 1, January 2011

Online privacy as a corporate social responsibility: an empirical study

Irene Pollach, Aarhus School of Business, University of Aarhus, Aarhus, Denmark

doi: 10.1111/j.1467-8608.2010.01611.x

© 2010 The Author. Business Ethics: A European Review © 2010 Blackwell Publishing Ltd., 9600 Garsington Road, Oxford, OX4 2DQ, UK and 350 Main St, Malden, MA 02148, USA

Abstract

Information technology and the Internet have added a new stakeholder concern to the corporate social responsibility (CSR) agenda: online privacy. While theory suggests that online privacy is a CSR, only very few studies in the business ethics literature have connected the two. Based on a study of CSR disclosures, this article contributes to the existing literature by exploring whether and how the largest IT companies embrace online privacy as a CSR. The findings indicate that only a small proportion of the companies have comprehensive privacy programs, although more than half of them voice moral or relational motives for addressing online privacy. The privacy measures they have taken are primarily compliance measures, while measures that stimulate a stakeholder dialogue are rare. Overall, a wide variety of approaches to addressing privacy was found, which suggests that no institutionalization of privacy practices has taken place as yet. The study therefore indicates that online privacy is rather new on the CSR agenda, currently playing only a minor role.

Introduction

Since the 1990s, companies striving to be good corporate citizens have had to devise strategies to address issues such as pollution, energy use, waste production, animal testing, child labor, sweatshops, workforce diversity, or advertising to children. It has become a de facto standard for very large corporations to publish social reports documenting how they address these issues in the marketplace, the workplace, the supply chain, and the community in order to fulfill their role as good corporate citizens (Snider et al. 2003). The advent of the Internet has not only revolutionized many business models but has also redefined what it means to be a good corporate citizen (Post 2000), as most of the above issues are of little relevance to companies dealing with data and technology. One issue of public concern that has become highly relevant for IT companies is online privacy (De George 2000, Johnson 2006).

Information privacy denotes an individual's right to decide what information is made available to others (Westin 1967). Privacy is thus guaranteed only if individuals know that data are collected about them and if they have control over this data collection and the subsequent use of the data (Foxman & Kilcoyne 1993, Caudill & Murphy 2000). In the United States, privacy-related legislation exists only for health care, financial services, and children on the Internet (Bowie & Jamal 2006), while many aspects of data collection and user control in electronic commerce are still unregulated (Fernback & Papacharissi 2007). Countries of the European Union, meanwhile, protect privacy more strictly (Baumer et al. 2004), which has proven to be a hurdle for US technology companies operating in Europe. In 2008, for example, technology giant Google encountered problems in several European countries with its data handling practices (O'Brien 2008). Despite legislative efforts in Europe, data privacy violations have occurred in a number of large organizations, including, for example, the largest German bank, Deutsche Bank (Neate 2009), and T-Mobile UK (Wray 2009). The problems with privacy legislation are that it is difficult to identify violations of these laws and that the law may lag behind what is technologically feasible.
For the above reasons, global companies have some discretion over how much privacy they grant users and how much they reveal about their data handling practices to their users. This discretion adds extra complexity to the moral issue of whether companies take advantage of their powerful position by collecting and using data from users to further their own business interests, for example by sending out unsolicited promotional e-mails or selling user data (Pollach 2005). The discretion companies can exercise when it comes to information privacy, and the ethical implications of this discretion, entail that information privacy is a question of corporate morality. While theoretical work on corporate social responsibility (CSR) suggests that privacy could be a meaningful addition to a corporate CSR program, little is known about corporate practices. This paper therefore sets out to explore whether and how companies whose core business is based on data and technology are embracing information privacy as a CSR. The findings suggest that information privacy is emerging as an element of CSR programs, but that there is a great deal of variety regarding the adoption of privacy as a CSR. The paper first discusses the moral issues behind information privacy on the Internet, reviews the literature on corporate responses to people's privacy concerns, and then looks at the literature on privacy as a CSR. After describing the sample and the methodology underlying this study, the results are presented and their implications are discussed.

The ethics of information privacy

The very core of electronic and mobile commerce revolves around technology, digitization, and the exchange of information, which poses a number of ethical problems (Zonghao 2001). A particular challenge to information handling in electronic commerce is the trade-off between collecting data for the sake of transparency and not collecting data for the sake of privacy (Introna & Pouloudi 1999).
Another challenge is the trade-off between collecting data for the sake of profits and not collecting data for the sake of privacy. As commercial transactions on the Internet or through mobile phones are commonly based on credit-card payments and the shipment of goods to the buyer's home address, the balance is tipped towards the need for disclosure rather than the safeguarding of privacy. However, companies collect not only personally identifying information (PII) from transactions but also collect PII when users register themselves, use online services, participate in sweepstakes or surveys, or send inquiries to the company. In addition to PII, companies collect anonymous click-stream data and compile anonymous user profiles when Internet users navigate the companies' websites (Kelly & Rowland 2000). Through the collection of IP addresses, PII can also be combined with anonymous click-stream data in order to obtain very comprehensive user profiles (Payne & Trumbach 2009). The easier access to and increased mobility of data have made information a commodity that is bought and sold by data brokers (Spinello 1998). It is therefore also possible for companies to buy datasets of user information from data brokers and merge them with the data they have collected themselves. Companies may use the data they collect from customers and visitors on their websites merely to execute transactions, recognize users when they return to the site, and improve their website design based on users' interests. But companies may equally use such data for purposes other than those they were collected for. For example, they may target banner ads at users, harass users with unsolicited commercial e-mails, or share this information with third parties (Han & Maclaurin 2002). A growing body of literature documents people's concerns about privacy violations in online transactions (e.g. Culnan & Armstrong 1999, Phelps et al.
2000, Sheehan 2002, Norberg & Horne 2007, Norberg et al. 2007). Essentially, these concerns stem from the imbalance in power between companies as data collectors and users as data providers. While companies have superior knowledge of what user data are collected and how they are handled, users may not even be aware that data are collected, let alone that they are combined into user profiles.

Corporate response to privacy

At the turn of the century, some companies began to introduce chief privacy officers (Awazu & Desouza 2004). Their tasks include gathering information about social and legal aspects of privacy, devising the company's privacy strategy, disseminating information about corporate data handling practices to internal and external stakeholders, and representing the company's commitment to privacy (Kayworth et al. 2005). Another corporate response to information privacy is privacy policies posted on commercial websites (Sama & Shoaf 2002). The original idea behind privacy policies on websites was that companies would disclose how they handle the data they collect from users, while users would carefully read through the explanation of the company's data handling practices, understand their consequences, and then make an informed decision about divulging personal data or not (Ciocchetti 2007). In reality, privacy policies contain legalese, tech-speak, and other obfuscating language patterns that obscure questionable data handling practices (Pollach 2005, Fernback & Papacharissi 2007). Internet users have been found not to read privacy policies for the above reasons (Milne & Culnan 2004). Privacy policies are sometimes supplemented with privacy seals awarded by private-sector institutions (e.g. BBBOnline, TRUSTe, WebTrust) or accounting firms. These seals indicate that companies comply with responsible standards of data handling, as defined by the awarding institution (Smith & Rupp 2004). Consumers still have to read and understand the privacy policy, as the seal alone does not guarantee that the data handling practices of the company comply with an individual's privacy preferences (Rifon et al. 2005). The problem with privacy seals is also that they do not effectively protect users from privacy breaches. The seal-awarding institution may not know about a privacy breach or, if it does learn about it, can only revoke the seal, but has no means to help people regain lost privacy (Shapiro & Baker 2001). These measures are thus not suited to enhance user privacy or engender trust among Internet users.

Information privacy as a CSR

Carroll (1979) categorized corporate social responsibilities into economic, legal, ethical, and philanthropic responsibilities, arguing that making a profit is the quintessential responsibility of companies, together with their adherence to legal regulations. According to this classification, information privacy can be categorized as an ethical responsibility, given that legislation is insufficient to govern corporate decision making in all areas of data handling. This is elaborated on by Mintzberg (1983), who suggested that areas where CSR comes into play are those 'where existing legislation needs compliance with its spirit as well as its letter [and] where the corporation can fool its customers or suppliers or the government through its superior knowledge' (p. 12). If a company decides to address information privacy, it may not just do so because privacy is an ethical corporate responsibility. Rather, Aguilera et al.
(2007) argue that companies accept responsibility for social issues for three different reasons: (1) moral reasons determined by morality-driven values; (2) relational reasons driven by the company's concern about stakeholder relationships; and (3) instrumental reasons driven by corporate self-interest. Moral motives are enacted particularly by individuals with organizational decision-making power who have strong morality-based values. Relational motives are grounded in a company's desire to promote and balance stakeholder interests, thereby building trust, maximizing stakeholder wealth, and gaining social legitimacy (Aguilera et al. 2007). Instrumental approaches are self-interest driven, seeking to achieve greater competitiveness and to protect the corporate reputation (Aguilera et al. 2007). The latter approach corresponds to Jones' (1995) argument that companies that manage to earn the trust of their stakeholders will be able to secure a competitive advantage through savings on monitoring costs, bonding costs, transaction costs, and search costs arising from managing the various corporate stakeholder groups. Instrumental motives can also be driven by the desire to preempt costly government regulations (Aguilera et al. 2007). The strategy literature follows the instrumental approach to CSR, arguing that companies to which a particular responsibility is highly relevant can benefit from integrating this responsibility into their overall strategies. Burke & Logsdon (1996) list the following conditions in order for CSR to bring strategic advantages to the firm: the chosen CSR issue is central to the company's mission, is voluntarily embraced, brings benefits to both the firm and the public at large, is addressed in a proactive manner, and is visible to external stakeholders.
It has also been argued that CSR initiatives can bring sustainable competitive advantages in the form of a first-mover advantage (Lieberman & Montgomery 1998). However, for this advantage to emerge, the company must not only be the first one to address a particular CSR comprehensively but must also continuously seek to enhance what it has achieved in order to secure this advantage (Tetrault Sirsly & Lamertz 2008). The strategy literature therefore suggests that companies in the information technology industry could benefit from embracing online privacy as a CSR, especially if they make this commitment visible to external audiences. Although theory suggests that privacy could be a relevant CSR theme for particular companies, very few empirical studies have addressed the link between information privacy and CSR. They include Sharfman et al.'s (2000) survey among managers on how important they consider a number of social issues, including the protection of privacy. However, in the exploratory factor analysis they conducted, privacy was eliminated from further analyses. Fukukawa & Moon (2004) included information privacy as an indicator of CSR in their study of CSR activities reported by companies in Japan. In addition, Chaudhri's (2006) case study of global citizenship at Hewlett-Packard mentions privacy as one area the company has included in its CSR agenda. In previous theoretical work, Carroll (1998) highlighted the protection of online privacy rights as one area where the law lags behind ethical thinking and morality comes into play. Finally, Post (2000) examined the changing role of corporate citizenship in the 21st century and pointed to customer privacy as a new issue of CSR. To date, no article has empirically studied in what ways information privacy is actually addressed as a CSR.
Research design

This study explores whether and how companies are embracing online privacy as a social responsibility, focusing on what measures they claim to have taken and how they communicate these to their external stakeholders in their CSR disclosures. In view of the lack of previous research in this area, this study is exploratory in nature. Accordingly, its goal is to identify the variety of corporate practices rather than to compare and contrast companies. The starting point for the analysis is the three processes of CSR included in Basu & Palazzo's (2008) process model of sense-making: (1) the reasons a company states for engaging in specific CSR activities, (2) the kind of behavior a company displays to live up to its CSR commitments, and (3) the way in which a company regards its relationships with its stakeholders. This section first describes the sample and the data and then goes on to explain the methodology that was applied to analyze the data.

Sample

The sample consists of the largest companies from IT-related industries, as they are most closely intertwined with information through the hardware, software, or services they provide. To them, information privacy could be a meaningful strategic element of their CSR programs in two different ways. First, they may embrace privacy as a social responsibility in the way they collect and use data. Second, technology does not just violate privacy; it can also enhance privacy. Accordingly, IT companies may engage in corporate social innovation and develop privacy-enhancing products or commit themselves to educating consumers about privacy protection. Clearly, other large companies, such as retailers, operate online as well, but they were not considered for this study, as data and information are not at the core of their activities. Large companies were chosen, as these companies are believed to serve as lead innovators in their industries.
All IT-related companies from Europe and the United States listed among the Fortune Global 500 and the first 1,000 companies of the Forbes 2000 company rankings were included in the sample. Neither of the two rankings includes 'information technology' as an industry. Rather, both include a number of industries that deal with information and technology. These include Computer and Data Services, Computer Software, Computers & Office Equipment, Network and Other Communications Equipment, and Telecommunications from the Fortune Global 500 list and Software & Services, Technology Hardware & Equipment, and Telecommunications Services from the Forbes 2000 list. A few IT companies listed in these two rankings could not be included in the analysis, as they had been acquired by another company since the publication of the rankings. Also, the two rankings overlap to a substantial extent, so that the final sample amounted to a total of 95 IT companies. On each company's website, the CSR section was accessed. If there was no such section, sections dedicated to the company background, mission and values, or ethics were accessed. The goal was to download all texts pertaining at least loosely to CSR and, if available, the latest CSR report. An important criterion was that privacy-related information was collected only if it was framed as a CSR issue. Privacy policies, which are a standard element of every commercial website, were not collected, as their existence alone does not represent a commitment to social responsibility. Of the 95 companies in the initial sample, 30 companies mention privacy in their CSR discourse. The analysis is thus based on these 30 companies (see Appendix A). Their texts range from 21 to 2,367 words in length.
Methods

This exploratory study draws on both a positivist approach and a constructivist approach in order to look at the data as holistically as possible (cf. Jick 1979). When studying textual data, the fundamental difference between the two traditions is that the positivist tradition sees language as a transmitter of information, while the social constructionist tradition holds that people consciously and unconsciously create social realities when they use language. Accordingly, the textual data were first studied using quantitative content analysis, which systematically records the frequency of particular content features. Because of its quantitative, systematic nature, content analysis de-contextualizes the words from the discourse that is examined and therefore has no means to interpret its findings within a wider context. The findings of the content analysis were therefore combined with a discourse analysis, and the two are presented together. The combination of content analysis and discourse analysis has also been suggested by researchers in linguistics (van Dijk 1985, Herring 2004), sociology (Markoff et al. 1974), and information systems (Trauth & Jessup 2000). In this study, the results of both analyses together provide a much richer picture of corporate practices than one analysis alone could furnish. This is important, given the absence of previous research on privacy and CSR.

Content analysis systematically condenses texts into content categories by applying a coding scheme that produces quantitative indices of textual content (Krippendorff 1980, Weber 1985, Kolbe & Burnett 1991, Neuendorf 2002). The content analysis conducted as part of this study records in a systematic and exhaustive manner which companies in the sample have implemented which measures to improve user privacy. The approach chosen for this analysis uses factual codes, which capture precisely defined facts, as opposed to thematic codes, which capture themes addressed in a predefined textual unit (Kelle & Laurie 1995). The factual codes pertain to privacy measures companies have actually taken, but exclude those that companies plan to implement in the future. With no existing coding scheme available, a preliminary coding scheme was developed from the data by examining the texts in the sample inductively (cf. Strauss & Corbin 1990) for measures that companies have taken to secure user privacy. Overall, 41 different measures were identified. The measures were recorded dichotomously as being either present (1) or absent (0). They are listed in Table 2 together with the results.

The qualitative approach chosen here was discourse analysis, following a social constructionist tradition, which views discourse as a social action that is shaped by and shapes the context in which it occurs (van Dijk 1997a). Discourse analysis is a method of textual analysis that focuses on how and why language is used in a particular way (van Dijk 1997b). It is based on the premise that people intentionally and unintentionally construct social realities when they engage in discourse. They use language in their roles as members of particular social groups, professions, institutions, or communities, but also construct such roles when they use language in social situations (van Dijk 1997a). Similarly, organizational texts can be constructive and constitutive of realities, just like the text or speech of individuals (Fairclough 2005).
Discourse analysis typically pays attention to language features such as repetitions, pronouns, passive voice, nominalizations, modal verbs, agent-patient relations in sentences, and attitudinal lexis in order to study the roles assigned to the participants in the discourse, the power relations between them, and the foregrounding or backgrounding of concepts and events. The discourse analysis conducted here examines how companies present themselves as responsible companies when it comes to privacy and data handling. Basu & Palazzo's (2008) process model of CSR has guided the analysis and therefore also provides the structure of the results section. Accordingly, the results section starts with the companies' reasons for including privacy in their CSR programs, then presents privacy measures companies have taken as part of their CSR initiatives, and ultimately studies the relationships with the various stakeholders that are affected by the company's privacy practices. The reasons for including privacy and the stakeholder relationships are analyzed in the form of a discourse analysis. The analysis of the privacy measures is based on a content analysis, but enhanced with qualitative insights, as needed.

Results

Reasons for privacy as CSR

The texts were examined for indications of why the companies include privacy in their CSR programs. Only 13 companies voiced their motivation for engaging in privacy protection, presenting different reasons why they engage in CSR. The communicated motives have been grouped according to Aguilera et al.'s (2007) classification of moral, relational, and instrumental CSR motives. Table 1 shows this categorization together with the text passages where these motives were expressed. The moral motives found include the understanding that Internet users have privacy rights, which the company wants to observe, and the acknowledgement that the company has the responsibility to protect the data they gather from Internet users. Relational motives include the recognition that customers have a desire for privacy, which the company seeks to meet, and the expectation that privacy protection will help the company win customers' trust. Ultimately, one company expects to benefit from its privacy program in that it expects to gain a reputational advantage from privacy protection.

Table 1: Communicated motives for corporate privacy programs

Moral motives. Three companies acknowledge that people have a right to privacy:
- 'To us, the right to privacy includes the right of individuals to have a voice in the use and dissemination of their personal information.'
- 'A person has the right to control what information about him or her is collected and to determine how that information is used.'
- 'Confidentiality and security of consumer data ... are areas safeguarded by PT in order to respect the freedom and basic rights of each individual.'

Four companies hold that they have a responsibility to protect the data they gather from Internet users:
- 'We feel a strong responsibility to help ensure a safer, more enjoyable Internet, while addressing the challenges to privacy and security posed by today's new media.'
- 'Companies have a responsibility to ensure that the information they hold about their customers and employees is protected, stored, transferred, and used in a responsible manner.'
- 'Microsoft takes seriously its responsibility to help address the security and privacy challenges of the information-based society, from viruses and spyware to spam and online identity theft.'
- 'Respect for privacy is part of our commitment to observe high standards of integrity and ethical conduct in all our operations.'

Relational motives. Two companies recognize that customers have a desire for privacy that needs to be met:
- 'Protecting our customers' privacy is a priority. We understand and respect your desire to protect your personal information.'
- 'The protection of personal information is a very high expectation among our customers, and to meet it, we ...'

Four companies view privacy protection as a means to winning customer trust:
- 'Externally, Sabre is committed to building customer relationships based on trust, and that includes recognizing the importance of protecting personal information.'
- 'Consumer trust and confidence is critical to Cisco's business and to any technology and Internet-related business; as a result, the industry must protect citizens' privacy.'
- '[We] have to acquire a "license to operate" by conducting our business in a decent and responsible way.'
- 'Security and reliability form the basis of Telekom Austria Group's stable and successful customer relationships. The Group therefore gives top priority to protecting the integrity and confidentiality of sensitive data.'

Instrumental motives. One company states that it expects to gain a reputational advantage from its privacy program:
- 'Main opportunities: Enhance customer and employee trust, ... support brand/reputation.'

CSR behavior

The content analysis revealed 41 different measures companies had taken to support user privacy (see Table 2). They have been grouped into four categories, which are discussed below. One company has implemented 19 of these measures, and nine companies have implemented eight, nine, or 10 different measures. At the other end of the spectrum, there are two companies that have not implemented a single measure, but still talk about privacy in the context of CSR. Further, eight companies have implemented one or two measures, and nine companies have implemented between three and seven measures. Most commonly, a measure was taken by only one company (19 measures) or two companies (six measures). The measure taken most frequently was taken by 15 companies. Thus, there is a broad variety in how companies address privacy.

It is also worth noting that it is not necessarily the biggest companies in the industry that have taken lead roles in protecting user privacy. When ranking all companies according to their ranks on the Forbes 2000 and the Fortune Global 500 lists, one can see that the company with the highest number of privacy measures ranks among the top three on both the Forbes and the Fortune list. The other two companies among the top three in the Fortune and Forbes rankings have implemented only one and three measures, respectively. The three companies that have implemented the second highest number of privacy measures occupy ranks #77, #87, and #173 on the Fortune Global 500 list and ranks #49, #518, and #782 on the Forbes 2000 list, which indicates that it is not necessarily the biggest companies in the IT industries that embrace information privacy. An investigation of the relationship between the number of measures taken and the length of the privacy text on the corporate website revealed a correlation of 0.77.
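The quantitative step behind this figure, dichotomous (0/1) coding of each measure per company, summing to a measure count, and correlating that count with text length, can be sketched in a few lines of Python. This is a minimal illustration only: the company names, codings, and word counts below are invented, not the study's actual data.

```python
# Sketch of the study's quantitative step: dichotomous coding of privacy
# measures per company, then a Pearson correlation between the number of
# measures taken and the length of the company's privacy text.
# NOTE: all data below are hypothetical, for illustration only.

from math import sqrt

# Rows: companies; columns: presence (1) / absence (0) of each coded measure.
codings = {
    "CompanyA": [1, 1, 0, 1, 0],
    "CompanyB": [1, 0, 0, 0, 0],
    "CompanyC": [1, 1, 1, 1, 1],
}
text_lengths = {"CompanyA": 850, "CompanyB": 120, "CompanyC": 2100}  # words

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Number of measures per company = row sum of the dichotomous codings.
measure_counts = [sum(row) for row in codings.values()]
lengths = [text_lengths[name] for name in codings]

r = pearson(measure_counts, lengths)
print(f"measure counts: {measure_counts}, r = {r:.2f}")
```

With real codings for the 30 companies and the 41 measures, the same row-sum-then-correlate procedure would yield the reported r = 0.77.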
This suggests that text length is an indicator of how important the issue is to a company. At the same time, it also shows that the companies generally do not talk at length about privacy without having taken relevant measures.

One category of measures pertains to the companies' internal affairs. They address processes, employee conduct, and, to a small extent, suppliers. The measures mentioned most frequently are the existence of a privacy policy and privacy training, privacy being part of the code of conduct, privacy officers, physical data protection, and regular review of systems and processes. All other measures taken internally were taken by one, two, or three companies each, for example measures encouraging employees to report privacy violations and to comply with relevant guidelines. Two different measures pertaining to suppliers or other third parties were identified, namely that the company reviews the privacy practices of those partners and that these outsiders are bound to a privacy agreement.

Table 2: The content of corporate privacy programs (number of companies per measure)

Internal (79):
- Physical protection of data: 6
- Procedural/administrative protection of data: 2
- Electronic/technical protection of data: 3
- Privacy policy: 15
- Privacy is part of the code of conduct: 8
- Privacy officer(s): 7
- Privacy board/working group: 3
- Employee training: 9
- Disciplinary action for employee misconduct: 1
- Privacy newsletter for employees: 1
- Employee monitoring: 1
- Privacy included in employment contract: 1
- Online resources for employees: 1
- Ethics hotline for privacy questions: 1
- Internal privacy campaign: 1
- Limited employee access to data: 3
- Online reporting of privacy incidents: 1
- Regular review of systems and processes: 5
- Regular review of privacy policy: 3
- Binding third parties to privacy agreements: 5
- Reviewing third-party privacy practices: 2

External (30):
- Privacy newsletter for customers: 1
- Guidance/information for consumers: 10
- Resources for parental control & child safety: 5
- Privacy e-mail address: 2
- Integrating privacy into product development: 8
- Privacy blog: 1
- Involving stakeholders in design of privacy policy: 1
- Supporting IS education at schools and universities: 1
- Publishing privacy research papers: 1

Collaborations (25):
- Supporting law making: 2
- Supporting industry self-regulation: 1
- Working with industry: 5
- Working with governments: 6
- Working with NGOs, think tanks: 10
- Political action committee (PAC): 1

Compliance (21):
- Compliance with laws: 11
- Exceeding laws: 1
- Compliance with Safe Harbor: 4
- Compliance with GRI: 1
- Privacy seal: 4

The second category of measures contains those directed towards external stakeholders. They include primarily guidance for consumers regarding Internet privacy. Five companies take measures that address parents' concerns about their children's privacy. In addition to providing information, companies also solicit consumer feedback on privacy matters. Two companies highlight that they have an e-mail address to which people can send privacy concerns and inquiries, and one company involves stakeholders in the design of its privacy policy. The inclusion of privacy considerations in product development was embraced by eight companies.

Another group of measures pertains to participation in industry initiatives and collaborations. Ten companies mention a variety of privacy forums, centers, associations, think tanks, and institutes in which they are involved, including, for example, the Electronic Privacy Group, the European Privacy Officers Forum, or the Liberty Alliance. Some of them also state that they cooperate with other companies and governments. However, the nature of this cooperation remains unclear, and in some places the cooperating institutions are not even mentioned. Ultimately, a few US companies express their views on privacy legislation. As part of the measures they have taken, three companies take an active stance for either privacy legislation or self-regulation. Both of these viewpoints are visions at this point, as there is neither comprehensive privacy legislation nor a functioning model of self-regulation in the United States. The two viewpoints are as follows:

'We also believe that governments must find improved ways to enforce laws against data breach, misuse and fraud, and help consumers pursue those who mishandle their personal information. ... HP was one of the first companies to embrace the idea of a comprehensive U.S. privacy law.'

'Because disparate and multiple privacy rules place a heavy burden on global companies, we support a model of industry self-regulation (as opposed to government intervention) in which innovative tools give consumers greater choice in both protecting their personal data and understanding how it may be collected and used.'

Eleven companies state that they comply with all relevant privacy laws. As compliance with laws is a legal rather than an ethical responsibility according to Carroll's (1979) classification of corporate responsibilities, only going beyond the law can qualify as a CSR initiative. Dressing up a legal responsibility as an ethical responsibility casts doubt over the sincerity of these efforts. In fact, one of these 11 companies has implemented no other privacy measure apart from legal compliance. There is only one company that vows to exceed legal requirements: 'HP is pioneering an approach to the protection and responsible use of personal information. This effort goes beyond compliance with the law.' Only a minority of companies have adopted the privacy standards of outside organizations, such as GRI or privacy seal programs.

Stakeholder relationships

The measures identified above relate to a number of internal and external stakeholder groups, including employees, consumers, parents, industry, suppliers, governments, advocacy groups, and the community at large. However, the analysis of the measures does not reveal anything about the relationships with stakeholders, and in some cases the stakeholder group to which a particular measure was addressed was not even mentioned. This section therefore focuses specifically on the stakeholder groups to which the companies express some form of consideration.
This could be in the form of protection measures, information provision, cooperation, or merely by expressing an awareness of their stakes in privacy. In addition to an account of these overt commitments to stakeholders, a discourse analysis is used to uncover discursively constructed relationships with stakeholders. Table 3 lists the various stakeholder groups identi? d, together with their stake in privacy, the number of companies that made a commitment toward each stakeholder group, and an example of such a commitment. This table is different from the results presented in Table 2 in that it was not concrete actions that guided this analysis, but the awareness of stakeholder concerns. We ? nd that companies recognize primarily the stakes of their customers and employees, who exercise a direct and economic in? uence on the company and can therefore be labeled Even companies that do not take a stance on the legislation vs. self-regulation debate emphasize compliance with legislation. Eleven companies state that 96 2010 The Author Business Ethics: A European Review r 2010 Blackwell Publishing Ltd. Business Ethics: A European Review Volume 20 Number 1 January 2011 â⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦.. Table 3: Addressing stakeholder concerns Stakeholder GroupStake # Primary Customers/ Protection of 25 Users their data Employees Suppliers/ Vendors Training Guidelines 14 6 Example ââ¬ËIn order to help our customers address these issues, we have begun to develop guidance documents to help customers understand which parts of our technology may have privacy applications. ââ¬ËWe work hard to ensure that Sun employees have the information they need to apply our privacy protection standards in their work. 
ââ¬â¢ ââ¬ËWhen it is necessary for business reasons to share a personââ¬â¢s information with third parties such as network service providers and marketing campaign partners, we work together to ensure that we main tain the highest privacy standards. ââ¬â¢ ââ¬ËWe met with government of? cials and regulators in all regions to understand their concerns and initiatives and to help them fully appreciate the potential implications for privacy of new technologies. ââ¬â¢ ââ¬ËWe are working with other industry participants . . . to develop solutions that help us reach both of these objectives. ââ¬ËIn 2007, we formed our Stakeholder Advisory Council (SAC) comprising respected experts from a variety of nongovernmental organizations. ââ¬â¢ ââ¬ËSymantec is committed to helping parents keep their kids cybersafe. We believe that in the same way that we educate our children about the risks of drugs, smoking, or violence, it is critical that we educate them about the importance of safe computing. ââ¬â¢ ââ¬ËWe tap this internal resource to offer programs that bene? t our local schools and communities. We are also in the process of implementing an employee-led education program. â⬠⢠Secondary Government Industry Advocacy groups Parents Compliance with laws; expertise in data handling Cooperation Cooperation 6 6 3 Protection of 5 their childrenââ¬â¢s data Expertise 1 Schools/ communities â⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦. ââ¬Ëprimary stakeholdersââ¬â¢ according to Ansoff (1965). However, there are also companies that talk about privacy in a CSR context, but do not voice a commitment to these two primary stakeholder groups. Of the 30 companies, ? 
ve do not state that they do anything to improve the privacy situation of their customers and 16 do not make such a commitment toward their employees. Suppliers, who are also primary stakeholders, are addressed to a smaller extent. We can also see that the companies in the sample largely neglect their secondary stakeholders, i. e. those groups who do not directly in? uence a companyââ¬â¢s core business (Ansoff 1965).Only a maximum of six companies interact with each secondary stakeholder group, such as parents or governments. On the surface, all companies studied engage in a discourse characterized by care and concern for privacy. In particular, emotion-laden words like help, understand, respect, concern, and safe abound across all texts studied. For example: ââ¬ËProtecting our customersââ¬â¢ privacy is a priority. We understand and respect your desire to protect your personal information. ââ¬â¢ ââ¬ËAnd as the 24 A 7 demands of the Internet Age threaten to overwhelm customers with complexity, they need trusted and reliable companies to help them make sense of technology and put it to use to make their lives better. ââ¬â¢The tone becomes even more concerned when companies address their relationship with parents and children: ââ¬ËWe understand the responsibility and concern of parents who worry about their childrenââ¬â¢s exposure to inappropriate content and potentially dangerous interactions on the Web. ââ¬â¢ ââ¬ËProtecting our children . . . We believe that in the same way that we educate our children about the risks of drugs, smoking, or violence, it is critical r 2010 The Author Business Ethics: A European Review r 2010 Blackwell Publishing Ltd. 97 Business Ethics: A European Review Volume 20 Number 1 January 2011 that we educate them about the importance of safe computing. 
In the second example, the pronoun 'we/our' adds to the concerned tone by promoting a sense of collegiality and shared affection. The same is also achieved in other places, when companies use this inclusive form of 'we' to reduce the distance between themselves and their outside stakeholders: 'Our individual sensitivities about how our information is treated . . . are not uniform' or 'Sun is committed to investigating and addressing the privacy challenges . . . associated with our increasingly digital way of life.' In such statements, companies reduce the power distance between themselves and their stakeholders. The inclusive 'we' is also an indicator of positive politeness (Brown & Levinson 1987), indicating how writers conceptualize their audiences and what kind of distance writers create between themselves and their audience. While some companies use the inclusive 'we,' others talk about companies in general, e.g. 'all businesses are responsible for . . . ,' which includes themselves only implicitly and distances them from these events. Mostly, though, companies make themselves the causal agents: 'we must address these concerns by helping to protect . . ..' Notably, one company draws its audiences into the discourse by always addressing them directly, e.g. 'We understand and respect your desire to protect . . ..'

All together, the different voices present in these texts suggest that companies have different levels of self-awareness and different understandings of their role in this process. Less variety exists in the distance to the audience, which is, apart from one exception, not explicitly present in the discourse. This suggests that companies do not consider their CSR activities to be dialogic in nature.

Another kind of discourse is found in 10 of the companies' texts studied. This discourse reveals that some companies are actually interested in finding a balance between users' privacy interests and their own business interests rather than protecting privacy unconditionally. They seek to achieve a balance between customers' privacy interests and 'business priorities,' 'business requirements,' 'business needs,' their 'values,' or their 'ability . . . to reap the benefits of online interactions.' Business interests are also communicated implicitly: 'our goal is simple: to balance the interests and concerns of our customers' private information with their interest in receiving quality service and information about useful new products.' Alternatively, one company mentions only one weight of the balance, without saying what the other weight is: 'that we are striking the right balance for our customers' and 'to reach balanced results.' The discourse of balance is a manifestation of the companies' power, given that it is they who decide when this balance is reached. Interestingly, this kind of discourse has nothing to do with the motivations the companies express. Two companies, for example, have voiced moral motives but also engage in this discourse of balance, as does the one company that has indicated an instrumental motive. It is also worth noting that not a single European company in the sample engages in this discourse of balance.

Discussion

The literature review has highlighted that users are concerned about privacy and that companies do not respond in a manner that eases stakeholder concerns. The companies chosen for this study are all active in the hardware, software, or telecommunications industries, in which data play a crucial role. Thus, information privacy, and in particular online privacy, is a central issue in their business conduct. The content analysis has revealed that only a small proportion of the largest IT companies comprehensively address privacy as a social responsibility. In the sample, we find both companies that have taken a number of relevant actions to address user privacy and companies that have taken only one or two concrete measures but nevertheless present privacy as part of their CSR program. A substantial proportion of the measures they have taken fall into the area of compliance and employee conduct (e.g. guidelines, policies, monitoring, and reporting), while measures that stimulate a stakeholder dialogue or represent corporate social innovation are found less frequently.

Further, some companies reveal that they seek to strike a balance between their own business interests and their stakeholders' privacy needs. The sample even contains companies that voice moral motives for framing online privacy as a CSR, while at the same time indicating that they are interested in striking a balance between users' privacy interests and their own business interests. We have also seen that some of the privacy measures are actually intended to fulfill legal responsibilities rather than ethical ones. Thus, some companies in the sample voice concerns and a commitment to help, but do not take privacy to the level of an ethical responsibility (cf. Carroll 1991). At the same time, companies load their privacy discourse with emotive terms suggesting concern, commitment, and a desire to help. While this kind of language is typical of CSR messages and can almost be expected (cf. Pollach 2003), it still contrasts with the results of the content analysis, which has shown that comprehensive privacy programs are for the most part non-existent.

The findings also indicate that companies have chosen a wide variety of approaches to information privacy. In fact, many of the different measures identified were taken by only one, two, or three companies.
Thus, little mimicry and no institutionalized practices have emerged yet. In uncertain environments, companies have a tendency to model themselves after other companies that are more successful or more respected. This mimicry leads to institutionalized practices that help companies obtain legitimacy (DiMaggio & Powell 1983). The environment in which the sample companies operate can be characterized as uncertain, as there is no comprehensive privacy legislation as yet and privacy is, to some extent, at each company's discretion. For mimicry behavior to occur, it must be clear to the firm that adopting a certain practice brings competitive advantages (DiMaggio & Powell 1983). In the case of privacy, an institutionalization of voluntary privacy practices could mean that privacy regulation is preempted. However, as not every company in the sample, and maybe in the industry as a whole, is pro self-regulation, some companies may decide not to adopt privacy practices voluntarily, despite the fact that they care about user privacy.

Privacy may be on its way to maturing from an ethics/compliance focus to a more responsive, proactive focus, but at the moment, it plays a minor role as a CSR. This point is also reflected in the finding that companies primarily address consumer concerns and step up employee training, while all other stakeholder groups in privacy play a subordinate role. Companies may not have recognized the benefits to be gained from engaging with secondary stakeholder groups, e.g. from cooperating with industry partners. At the same time, companies may have been too occupied with implementing privacy standards internally, so that their privacy efforts do not involve secondary stakeholders as yet. These internal compliance measures are clearly the sine qua non for a company's external privacy activities, such as participation in industry initiatives. This study is not without limitations.
One clear limitation is that the data stem from corporate self-reports, which are problematic (cf. Podsakoff & Organ 1986) in that they are based on what the company reveals rather than what is actually true. This could mean that companies overstate their activities. At the same time, companies may not have mentioned particular measures they have taken because they did not consider them important enough. Also, the sample size could have been larger, but the small sample size also serves to illustrate that privacy is only just beginning to play a role in the CSR programs of technology-oriented companies.

APPENDIX A: COMPANIES IN THE SAMPLE

Adobe, Agilent, AT&T, Belgacom, British Telecom, Cisco, Computer Associates, Dell, Deutsche Telekom, Electronic Data Systems, France Telecom, HP, IBM, Microsoft, Motorola, Nokia, Oracle, Portugal Telecom, Royal KPN, Sabre, Sprint, Sun, Symantec, Telefonica, Telekom Austria, TeliaSonera, Verizon, Virgin, Vodafone, Xerox

References

Aguilera, R.V., Rupp, D., Williams, C.A. and Ganapathi, J. 2007. 'Putting the S back in CSR: a multilevel theory of social change in organizations'. Academy of Management Review, 32:3, 836-863.
Ansoff, I. 1965. Corporate Strategy. New York, NY: McGraw-Hill.
Awazu, Y. and Desouza, K.C. 2004. 'The knowledge chiefs: CKOs, CLOs and CPOs'. European Management Journal, 22:3, 339-344.
Basu, K. and Palazzo, G. 2008. 'Corporate social responsibility: a process model of sensemaking'. Academy of Management Review, 33:1, 122-136.
Baumer, D.L., Earp, J.B. and Poindexter, J.C. 2004. 'Internet privacy law: a comparison between the United States and the European Union'. Computers and Security, 23:5, 400-412.
Bowie, N. and Jamal, K. 2006. 'Privacy rights on the internet: self-regulation or government regulation?'. Business Ethics Quarterly, 16:3, 323-342.
Brown, P. and Levinson, S.C. 1987. Politeness. Cambridge: Cambridge University Press.
Burke, L. and Logsdon, J.M. 1996. 'How corporate social responsibility pays off'. Long Range Planning, 29:4, 495-502.
Carroll, A.B. 1979. 'A three-dimensional conceptual model of corporate performance'. Academy of Management Review, 4:4, 497-505.
Carroll, A.B. 1991. 'The pyramid of corporate social responsibility: toward the moral management of organizational stakeholders'. Business Horizons, 34:4, 39-48.
Carroll, A.B. 1998. 'The four faces of corporate citizenship'. Business and Society Review, 100:1, 1-7.
Caudill, E.M. and Murphy, P.E. 2000. 'Consumer online privacy: legal and ethical issues'. Journal of Public Policy and Marketing, 19:1, 7-19.
Chaudhri, V.A. 2006. 'Organising global CSR: a case study of Hewlett-Packard's e-inclusion initiative'. Journal of Corporate Citizenship, 23, 39-51.
Ciocchetti, C.A. 2007. 'E-commerce and information privacy: privacy policies as personal information protectors'. American Business Law Journal, 44:1, 55-126.
Culnan, M.J. and Armstrong, P.K. 1999. 'Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation'. Organization Science, 10:1, 104-115.
De George, R.T. 2000. 'Business ethics and the challenge of the information age'. Business Ethics Quarterly, 10:1, 63-72.
DiMaggio, P.J. and Powell, W.W. 1983. 'The iron cage revisited: institutional isomorphism and collective rationality in organizational fields'. American Sociological Review, 48:2, 147-160.
Fairclough, N. 2005. 'Critical discourse analysis, organizational discourse, and organizational change'. Organization Studies, 26:6, 915-939.
Fernback, J. and Papacharissi, Z. 2007. 'Online privacy as legal safeguard: the relationship among consumer, online portal, and privacy policies'. New Media and Society, 9:5, 715-734.
Foxman, E.R. and Kilcoyne, P. 1993. 'Information technology, marketing practice, and consumer privacy: ethical issues'. Journal of Public Policy and Marketing, 12:1, 106-119.
Fukukawa, K. and Moon, J. 2004. 'A Japanese model of corporate social responsibility? A study of website reporting'. Journal of Corporate Citizenship, 16, 45-59.
Han, P. and Maclaurin, A. 2002. 'Do consumers really care about online privacy?'. Marketing Management, 11:1, 35-38.
Herring, S.C. 2004. 'Computer-mediated discourse analysis: an approach to researching online behavior'. In Barab, S.A., Kling, R. and Gray, J.H. (Eds.), Designing for Virtual Communities in the Service of Learning: 338-376. New York, NY: Cambridge University Press.
Introna, L.D. and Pouloudi, A. 1999. 'Privacy in the information age: stakeholders, interests and values'. Journal of Business Ethics, 22:1, 27-38.
Jick, T.D. 1979. 'Mixing qualitative and quantitative methods: triangulation in action'. Administrative Science Quarterly, 24, 602-611.
Johnson, D. 2006. 'Corporate excellence, ethics, and the role of IT'. Business and Society Review, 111:4, 457-475.
Jones, T.M. 1995. 'Instrumental stakeholder theory: a synthesis of ethics and economics'. Academy of Management Review, 20:2, 404-437.
Kayworth, T., Brocato, L. and Whitten, D. 2005. 'What is a chief privacy officer?'. Communications of AIS, 16, 110-126.
Kelle, U. and Laurie, H. 1995. 'Computer use in qualitative research and issues of validity'. In Kelle, U. (Ed.), Computer-Aided Qualitative Data Analysis: Theory, Methods and Practice: 19-28. London: Sage.
Kelly, E.P. and Rowland, H.C. 2000. 'Ethical and online privacy issues in electronic commerce'. Business Horizons, 43:3, 3-12.
Kolbe, R.H. and Burnett, M.S. 1991. 'Content-analysis research: an examination of applications with directives for improving research reliability and objectivity'. Journal of Consumer Research, 18:2, 243-250.
Krippendorff, K. 1980. Content Analysis: An Introduction to its Methodology. Beverly Hills, CA: Sage.
Lieberman, M.B. and Montgomery, D.B. 1998. 'First-mover (dis)advantages: retrospective and link with the resource-based view'. Strategic Management Journal, 19:12, 1111-1125.
Markoff, J., Shapiro, G. and Weitman, S.R. 1974. 'Toward the integration of content analysis and general methodology'. In Heise, D. (Ed.), Sociological Methodology: 1-58. San Francisco, CA: Jossey-Bass.
Milne, G.R. and Culnan, M.J. 2004. 'Strategies for reducing online privacy risks: why consumers read (or don't read) online privacy notices'. Journal of Interactive Marketing, 18:3, 15-29.
Mintzberg, H. 1983. 'The case for corporate social responsibility'. Journal of Business Strategy, 4:2, 3-15.
Neate, R. 2009. 'Deutsche Bank admits possible privacy breaches'. The Telegraph, July 23.
Neuendorf, K.A. 2002. The Content Analysis Guidebook. Thousand Oaks, CA: Sage.
Norberg, P.A. and Horne, D.R. 2007. 'Privacy attitudes and privacy-related behavior'. Psychology and Marketing, 24:10, 829-847.
Norberg, P.A., Horne, D.R. and Horne, D.A. 2007. 'The privacy paradox: personal information disclosure intentions versus behaviors'. Journal of Consumer Affairs, 41:1, 100-126.
O'Brien, K.J. 2008. 'Privacy laws trip up Google's expansion in parts of Europe'. New York Times, November 18.
Payne, D. and Trumbach, C.C. 2009. 'Data mining: proprietary rights, people and proposals'. Business Ethics: A European Review, 18:3, 241-252.
Phelps, J., Nowak, G. and Ferrell, E. 2000. 'Privacy concerns and consumer willingness to provide personal information'. Journal of Public Policy and Marketing, 19:1, 27-41.
Podsakoff, P.M. and Organ, D.W. 1986. 'Self-reports in organizational research: problems and prospects'. Journal of Management, 12:4, 531-544.
Pollach, I. 2003. Communicating Corporate Ethics on the World Wide Web: A Discourse Analysis of Selected Company Websites. Frankfurt: Peter Lang.
Pollach, I. 2005. 'A typology of communicative strategies in online privacy policies: ethics, power and informed consent'. Journal of Business Ethics, 62:3, 221-235.
Post, J.E. 2000. 'Moving from geographic to virtual communities: global corporate citizenship in a dot.com world'. Business and Society Review, 105:1, 27-46.
Rifon, N.J., LaRose, R. and Choi, S.M. 2005. 'Your privacy is sealed: effects of web privacy seals on trust and personal disclosures'. Journal of Consumer Affairs, 39:2, 339-362.
Sama, L.M. and Shoaf, V. 2002. 'Ethics on the web: applying moral decision making to the web'. Journal of Business Ethics, 36:1-2, 93-103.
Shapiro, B. and Baker, C.R. 2001. 'Information technology and the social construction of information privacy'. Journal of Accounting and Public Policy, 20:4, 295-322.
Sharfman, M.P., Pinkston, T.S. and Sigerstad, T.D. 2000. 'The effects of managerial values on social issues evaluation: an empirical examination'. Business and Society, 39:2, 144-182.
Sheehan, K.B. 2002. 'Toward a typology of internet users and online privacy concerns'. The Information Society, 18:1, 21-32.
Smith, A.D. and Rupp, W.T. 2004. 'Online privacy policies and diffusion theory perspectives: security or chaos?'. Services Marketing Quarterly, 25:3, 53-75.
Snider, J., Hill, R.P. and Martin, D. 2003. 'Corporate social responsibility in the 21st century: a view from the world's most successful firms'. Journal of Business Ethics, 48:2, 175-187.
Spinello, R.A. 1998. 'Privacy rights in the information economy'. Business Ethics Quarterly, 8:4, 723-742.
Strauss, A.L. and Corbin, J. 1990. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Newbury Park, CA: Sage.
Tetrault Sirsly, C.A. and Lamertz, K. 2008. 'When does a corporate social responsibility initiative provide a first-mover advantage?'. Business and Society, 47:3, 343-369.
Trauth, E.M. and Jessup, L.M. 2000. 'Understanding computer-mediated discussions: positivist and interpretive analyses of group support system use'. MIS Quarterly, 24:1, 43-79.
van Dijk, T.A. 1985. 'Levels and dimensions of discourse analysis'. In van Dijk, T.A. (Ed.), Handbook of Discourse Analysis, Vol. 2: 1-12. London: Academic Press.
van Dijk, T.A. 1997a. 'Discourse as interaction in society'. In van Dijk, T.A. (Ed.), Discourse as Social Interaction: 1-37. London: Sage.
van Dijk, T.A. 1997b. 'The study of discourse'. In van Dijk, T.A. (Ed.), Discourse as Structure and Process, Vol. 1: 1-34. London: Sage.
Weber, R.P. 1985. Basic Content Analysis. Beverly Hills, CA: Sage.
Westin, A.F. 1967. Privacy and Freedom. New York, NY: Atheneum.
Wray, R. 2009. 'T-Mobile confirms biggest phone customer data breach'. The Guardian, November 17.
Zonghao, B. 2001. 'An ethical discussion on the network economy'. Business Ethics: A European Review, 10:1, 108-112.

© 2010 The Author. Business Ethics: A European Review © 2010 Blackwell Publishing Ltd.