Methodological Appendix

From ethnographic descriptions of a single office to a longitudinal survey of activists across hundreds of organizations, there is a wide range of options for investigating questions like the ones posed in this book. Before settling on the mixed-methods approach described in this appendix, I initially considered a detailed case study of one or two organizations, perhaps using ethnographic methods. While such a study would have had a great deal of value, I quickly rejected this possibility because it seemed incapable of thoroughly examining the linkages and fractures between the many organizations serving the disparate constituencies under the Middle Eastern identity umbrella.

Interviews

The initial sample of interview respondents was obtained by calling, writing, or visiting the organizations’ offices. After meeting with the first wave of contacts, I pursued a purposive snowball sampling method (that is, I asked to be introduced to other respondents at the end of each interview). This quickly produced an extensive list of further interview contacts. I did not limit the pool of potential respondents by institutional affiliation; instead, I spoke with anyone affiliated with any advocacy organization in the field. This led to a sample spanning many more organizations than the six selected for focused analysis. The technique allowed for a more complete picture of the connections (and disconnections) between organizations across the field, while also yielding a thorough picture of the history of the six largest organizations that were the original focus of my research. A full list of interview subjects appears in table A.1.

In preparation for each interview, I constructed a customized list of topics to raise during the conversation, based on what I already knew about the individual and her or his organization. These in-depth, semi-structured qualitative interviews allowed respondents to direct the course of the conversation. Even though the interviews involved a basic level of trust, it is possible that some respondents withheld information or misremembered events. There are ongoing debates among qualitative methodologists over how interviews can best be used as a source of empirical data. Each conclusion I draw in this book is therefore supported by more than one source of data: wherever possible, I cross-referenced interviews against my content analysis, and I note in the text when I rely on only one source.

Four of my respondents worked in government, directly on Middle Eastern American civil rights issues. In addition to the advocates, two respondents were members of the clergy (one a Christian priest and one a Muslim imam). Eight interviews were with volunteers and community organizers who did not want to be affiliated with any one advocacy organization. Most interviews were conducted in person, either in the respondent’s office or at a restaurant, café, or similar location; several were conducted by telephone. Most interviews were audio recorded and later transcribed; where recording was not possible, I took notes during or immediately after the interview. After collecting notes and creating transcripts, I identified common themes across the interviews through a systematic qualitative analysis.

I was aware of my role as an outsider to the communities directly served by these advocates. I worried about being perceived as not only naïve and untrustworthy but perhaps even dangerous. Fortunately, these concerns proved unfounded. The vast majority of my respondents were eager to share their insights and opinions. Many, especially leading advocates and those who had worked in the advocacy field for a long time, were accustomed to giving interviews to researchers and journalists. A few had even met with American presidents and other heads of state, so they were hardly intimidated by my friendly (if persistent) requests for some of their time. In fact, two of my respondents revealed that they had been under active law enforcement surveillance in the past (one had actually seen his FBI dossier, years after it had been created). Both told me that they were knowledgeable about government espionage and knew right away that I was not an impostor. On a few occasions, an advocate asked for proof of my academic bona fides before agreeing to speak with me; a bit of conversation and the presentation of my business card sufficed. Some respondents asked about my personal beliefs, my political views, and my motivations for this research before I began asking questions of them. In all cases, I introduced myself with my academic affiliation to assure my respondents that I was a legitimate researcher, and I gave everyone my contact information in case of any concerns or questions after our meeting. This project, which involved human subjects, received an exemption from the University of California, Santa Barbara Institutional Review Board (proposal number 20080322). At Dickinson College, the project received expedited review and approval from the Institutional Review Board (protocol ID 97).

Table A.1. List of Interviews
Number | Organization | Date (as referenced in text) | Technical notes | Location of interviewee
1 | MSA | April 2005a | In person | California
2 | None | June 2005a | In person | Michigan
3 | Palestine Office | June 2005b | In person | Michigan
4 | Arab American and Chaldean Council (ACC) | July 2005a | In person | Michigan
5 | ACC | July 2005b | In person | Michigan
6 | ADC | July 2005c | In person | Michigan
7 | Mosque (clergy) | August 2005a | In person | Michigan
8 | None | August 2005b | In person | Michigan
9 | MSA | August 2005c | In person | Michigan
10 | ADC | August 2005d | In person | Michigan
11 | Church (clergy) | August 2005e | In person | Michigan
12 | ACCESS | August 2005f | In person | Michigan
13 | CAIR | December 2007 | In person | California
14 | CAIR | January 2008d | In person, not recorded | District of Columbia (DC)
15 | CAIR | January 2008a | In person | DC
16 | ADC | January 2008c | In person | DC
17 | MPAC | January 2008b | In person | DC
18 | ISNA | January 2008e | In person | DC
19 | MPAC | February 2008f | In person | DC
20 | ADC | February 2008b | In person | DC
21 | SAALT | February 2008d | Phone | DC
22 | ADC | February 2008a | In person | DC
23 | SALDEF | February 2008e | In person | DC
24 | AAI | February 2008c | In person | DC
25 | National Association of Muslim Lawyers (NAML) | February 2008g | Phone | DC
26 | United Sikhs | February 2008h | In person | DC
27 | ADC | March 2008a | In person | DC
28 | ADC | March 2008b | In person | DC
29 | CAIR–Chicago | March 2008c | Phone | Illinois
30 | NAML | March 2008d | In person | DC
31 | SALDEF | April 2008a | In person | DC
32 | None | April 2008b | In person | New York
33 | None | April 2008c | In person | DC
34 | Karamah | May 2008c | Phone | DC
35 | AAI | May 2008a | In person | DC
36 | American Task Force on Palestine | May 2008b | In person | DC
37 | None | June 2008g | Phone | DC
38 | CAIR–Chicago | June 2008e | Phone | Illinois
39 | RWG | June 2008c | In person | DC
40 | RWG | June 2008d | In person | DC
41 | SAALT | June 2008b | Phone | DC
42 | CAIR–Chicago | June 2008f | In person, not recorded | DC
43 | ACCESS | June 2008i | Phone | Michigan
44 | ICIRR | June 2008g | Phone | Illinois
45 | ADC | June 2008j | In person, not recorded | DC
46 | AAI | June 2008a | In person, not recorded | DC
47 | NNAAC | July 2008a | Phone | Michigan
48 | MPAC | September 2008a | Phone | DC
49 | CAIR | October 2008a | Phone | California
50 | Government official | February 2009a | Phone | Maryland
51 | CAIR | July 2009b | Phone | Minnesota
52 | ADC | July 2009c | Phone | DC (follow-up interview)
53 | DHS official | July 2009a | In person | DC
54 | ADC | August 2009b | In person | Michigan
55 | CAIR | August 2009a | In person | Michigan
56 | LCCR | August 2009c | In person, not recorded | Michigan
57 | SAALT | October 2010a | In person, not recorded | DC (follow-up interview)
58 | SAALT | October 2010b | In person, not recorded | DC (follow-up interview)
59 | Federal official (agency withheld) | February 2012a | In person | DC
60 | None | October 2012a | In person, not recorded | DC
61 | RWG | June 2013a | In person | DC (follow-up interview)
62 | CAIR | June 2013b | In person, not recorded | DC
63 | DHS | October 2013a | In person, not recorded | DC (follow-up interview)
64 | DOJ | October 2013b | In person | DC
65 | ADC | December 2013a | In person, not recorded | Michigan
66 | ADC | June 2014a | In person, not recorded | DC
67 | Government official | November 2014a | In person, not recorded | DC (follow-up interview)
68 | None | December 2014a | Phone, not recorded | DC (follow-up interview)
69 | NNAAC | March 2015a | In person, not recorded | DC
70 | AAI | September 2015a | Phone, not recorded | DC (follow-up interview)

Content Analysis

The sample of documents in the analysis was obtained in three ways. First, using desktop web archival software, I downloaded all information from several organizations’ publicly available websites, including all linked documents in several electronic formats, multiple times between 2008 and 2015. The software package, WinHTTrack, followed hyperlinks from the organizations’ homepages and downloaded all linked documents to my computer. Second, I visited the headquarters of several organizations between 2007 and 2010 and made digital copies of documents with a portable electronic scanner. Finally, I registered for the email newsletters of several organizations at various points between 2005 and 2013, and some of the newsletters delivered to me were added to the sample. In a few instances, specific collections of documents were sent to me by officials at the organizations. In all, I obtained more than ten thousand documents. Even this collection is not a representative sample of all of the work done by the organizations in the study, but it nevertheless provides a relevant sample describing the public-facing efforts of these advocacy organizations.
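For readers who want a concrete sense of the archiving step, the minimal sketch below expresses the same basic logic in Python rather than in WinHTTrack: fetch a homepage, follow its hyperlinks, and save any linked documents. The URL, folder name, and file extensions are hypothetical placeholders, not the organizations or settings actually used in the study.

# Minimal sketch only: the study used the WinHTTrack desktop archiver; this shows the
# same "follow links from the homepage and save linked documents" idea in Python.
import os
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.org/"                 # hypothetical organization homepage
SAVE_DIR = "archive"                               # local folder for downloaded copies
DOC_TYPES = (".pdf", ".doc", ".docx", ".html")     # electronic formats to keep

def archive_site(start_url: str, save_dir: str) -> None:
    os.makedirs(save_dir, exist_ok=True)
    homepage = requests.get(start_url, timeout=30)
    soup = BeautifulSoup(homepage.text, "html.parser")

    # Follow every hyperlink on the homepage and save any linked document.
    for link in soup.find_all("a", href=True):
        target = urljoin(start_url, link["href"])
        if not target.lower().endswith(DOC_TYPES):
            continue
        filename = os.path.basename(urlparse(target).path) or "index.html"
        response = requests.get(target, timeout=30)
        if response.ok:
            with open(os.path.join(save_dir, filename), "wb") as handle:
                handle.write(response.content)

if __name__ == "__main__":
    archive_site(START_URL, SAVE_DIR)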

To analyze the documents, I used optical character recognition software to enable computer-assisted keyword searches. Each keyword search produced a series of sentences containing my search terms. I read through these sentences, coding relevant “hits” into several different categories. These coded sentences became a pool of information describing organizational activity. When discussing the results of my analysis in the book, I cite each document as it was published by the organization. The process of compiling the documents informs my analysis in other ways as well.
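The sketch below illustrates the keyword-search step under the assumption that the OCR output has already been saved as plain-text files. The search terms, the folder name, and the output structure are illustrative examples, not the study’s actual coding scheme; the coding of each hit into categories was done by hand.

# Minimal sketch: pull every sentence containing a search term from OCR'd text files,
# keeping the source document so each hit can be read, coded, and later cited by hand.
import glob
import re

KEYWORDS = ["civil rights", "hate crime", "coalition"]   # placeholder search terms

def sentences_with_hits(text: str, keywords: list[str]) -> list[str]:
    """Return every sentence that contains at least one of the search terms."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)
    return [s for s in sentences if pattern.search(s)]

hits = []
for path in glob.glob("ocr_text/*.txt"):                  # hypothetical folder of OCR output
    with open(path, encoding="utf-8") as handle:
        for sentence in sentences_with_hits(handle.read(), KEYWORDS):
            hits.append({"document": path, "sentence": sentence})

print(f"{len(hits)} sentences to review and code")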

Database

My goal was to construct a database containing information about all active Middle Eastern American advocacy organizations operating at any point between 1980 and 2010. To obtain these data, I first developed a list of keywords with which to search the index of the Encyclopedia of Associations. The list was refined over several iterations, after I discovered archaic terms (e.g., “Moslem” for “Muslim”) in use in the 1980 edition. The final list of keywords, used to search both the 1980 and 2004 editions, appears in the list at the end of this appendix.
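To make the variant-spelling problem concrete, the short sketch below pairs each search term with older spellings so that index entries filed under archaic forms are not missed. The terms and sample index headings shown here are hypothetical illustrations; the full keyword list used in the study appears at the end of this appendix.

# Illustrative keyword list with archaic variants (e.g., "Moslem" alongside "Muslim").
KEYWORD_VARIANTS = {
    "Muslim": ["Muslim", "Moslem"],
    "Arab": ["Arab"],
}

def matching_index_terms(index_terms, keyword_variants):
    """Return the index entries that contain any variant of any keyword."""
    matches = []
    for term in index_terms:
        lowered = term.lower()
        if any(v.lower() in lowered
               for variants in keyword_variants.values() for v in variants):
            matches.append(term)
    return matches

# Example with a few hypothetical index headings.
sample_index = ["Moslem Organizations", "Arab American Associations", "Numismatics"]
print(matching_index_terms(sample_index, KEYWORD_VARIANTS))
# -> ['Moslem Organizations', 'Arab American Associations']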

After searching the index and obtaining a list of organizations and the locations of their entries, I copied the data from each entry into a Microsoft Access database. For this initial step, I did not code or otherwise alter the information as it was presented in the Encyclopedia. This process produced information on 159 organizations surveyed in the 1980 edition of the Encyclopedia and 382 organizations in the 2004 edition. Later, additional data on organizational finances were added from the GuideStar service to supplement the 2004 database.
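The sketch below shows the kind of record structure involved, using SQLite rather than the Microsoft Access database actually used in the study. The table name and field names are assumptions based on typical Encyclopedia of Associations entries and the GuideStar supplement described above, not the study’s actual schema.

# Minimal sketch of a database for the Encyclopedia entries (SQLite stand-in for Access).
import sqlite3

connection = sqlite3.connect("organizations.db")
connection.execute(
    """
    CREATE TABLE IF NOT EXISTS organizations (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        edition INTEGER NOT NULL,          -- 1980 or 2004 Encyclopedia edition
        entry_text TEXT,                   -- entry copied verbatim, uncoded
        annual_budget REAL                 -- added later from GuideStar (2004 only)
    )
    """
)

# Example row copied from a hypothetical 2004 entry.
connection.execute(
    "INSERT INTO organizations (name, edition, entry_text) VALUES (?, ?, ?)",
    ("Example Advocacy Association", 2004, "Founded 1985. Members: 2,000. ..."),
)
connection.commit()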

List of Keywords Used to Search the Encyclopedia of Associations