Saturday, May 31, 2014

Zappos Hiring: Culturally Fit? Legally Defensible?

Zappos has two separate sets of interviews. The hiring manager and his or her team interview for the standard criteria: fit within the team, relevant experience, technical ability, and so on. Then the HR department conducts a separate set of interviews purely for culture fit. Zappos has questions for each and every one of its ten core values. "They [applicants] need the relevant skill set and experience and so on," CEO Tony Hsieh says. "But far more important is, are they going to be good for the culture? Is this someone we would choose to have dinner or drinks with, even if they weren't working for Zappos?"

One of Zappos' core values is, "Create fun and a little weirdness." So, during an interview, a candidate gets asked, "On a scale of 1 to 10, how weird are you?" "If you're a 1, you're probably a little bit too strait-laced for us. If you're a 10, you might be too psychotic for us," says Hsieh. Another potential deal-breaker is saying you don’t socialize with co-workers outside the office. That could undermine another of Zappos’ core values: “Build a positive team and family spirit.”

Michael Hyter, senior partner at Korn Ferry, states: “The word ‘fit’ in the absence of that support factor [fair consideration for jobs for people who happen to be different] can easily be misinterpreted as ‘being like me,’ instead of what the position requires. Many organizations make the mistake of assuming that those tasked with selecting new hires are equipped to do so fairly because they are nice people or good workers. But failure to ensure the selection process is based on standard criteria with trained interviewers can result in unintentional bias in the spirit of looking for someone who’s a perceived ‘good fit.’”

Built-In Headwinds

As Nitasha Tiku wrote in Valleywag, "Taken at face value, ‘not a culture fit’ sings: Nothing personal, you'd probably be happier somewhere else! But what prioritizing ‘fit’ really allows a company to do is reject an applicant for not matching the pattern. For being an ‘other’ in some way. For coming from a different background, looking different, acting different, having different interests, all of which should be considered an asset."

More than forty years ago, the Supreme Court ruled in Griggs v. Duke Power Co. that "good intent or absence of discriminatory intent does not redeem employment procedures or testing mechanisms that operate as “built-in headwinds” for minority groups and are unrelated to measuring job capability." The Court found that some of Duke Power's hiring requirements, like a high school education or a passing score on an intelligence test, were irrelevant to the jobs at issue and had the effect of excluding qualified black workers. The Court held that the law "proscribes not only overt discrimination but also practices that are fair in form, but discriminatory in operation. The touchstone is business necessity. If an employment practice which operates to exclude Negroes cannot be shown to be related to job performance, the practice is prohibited."

As previously noted, failing to socialize with co-workers outside the office is a potential deal-breaker for Zappos. According to the company, that could undermine one of its core values: “Build a positive team and family spirit.” This out-of-office socializing "deal-breaker" eliminates many persons from employment consideration, including single parents, adults caring for elderly parents, persons attending school part-time, and well-rounded persons who have interests outside the company - persons who engage in charitable activities, participate in civic functions, or practice their faith.

A frequently-mentioned aspect of the Zappos culture involves the consumption of alcohol:
  • According to Business Insider, "Zappos prides itself on a fun, quirky company culture where the CEO Tony Hsieh has been known to dole out shots of Grey Goose."
  • A Zappos press release reads, "I had three vodka shots with Tony during my interview," says Rebecca Ratner, Zappos's head of human resources. "And I'm not atypical."
  • As Ms. Ratner stated, "While we rack up some pretty big bills for happy hours and parties, we believe that every one of those dollars comes back to us threefold in employee engagement, which to us is really what success is all about."
The prominent role alcohol plays in Zappos' hiring for "cultural fit" would seem to present built-in headwinds for Mormons, Baptists, Muslims, recovering alcoholics, diabetics, and persons scarred - physically or mentally - by the actions of alcoholics. As noted in Zappos: The Future of Hiring and Hiring Discrimination, many of the persons experiencing those headwinds are Protected Persons under federal and state employment discrimination laws.

During an interview at Zappos, a candidate gets asked, "On a scale of 1 to 10, how weird are you?" According to Zappos' CEO, "If you're a 10, you might be too psychotic for us." In a company that has "Create Fun and a Little Weirdness" as one of its core values, the word "weird" is associated with mental illness.

Historically, many employers asked applicants to provide information concerning their physical and/or mental condition. This information often was used to exclude and otherwise discriminate against individuals with disabilities - particularly nonvisible disabilities, such as mental illness - despite their ability to perform the job. The Americans with Disabilities Act (ADA) prohibits all disability-related inquiries before a job offer is made.

The use of tests and other selection procedures, like Zappos' cultural fit interview, can violate the ADA if an employer intentionally uses them to discriminate based on disability or if they disproportionately exclude persons with disabilities, like persons with mental illness.

EEOC guidance notes that employers should ensure that employment tests and other selection procedures are properly validated for the positions and purposes for which they are used. The test or selection procedure must be job-related and its results appropriate for the employer’s purpose. If a selection procedure screens out a protected group (e.g., by race, gender, age, or disability), the employer should determine whether there is an equally effective alternative selection procedure that has less adverse impact and, if so, adopt the alternative procedure.
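
One widely used screen for adverse impact is the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: if a group's selection rate is less than four-fifths (80%) of the rate for the most successful group, that is generally regarded as evidence of adverse impact. Here is a minimal sketch of the arithmetic in Python; the applicant and hire counts are hypothetical:

```python
# Four-fifths (80%) rule sketch, per the EEOC Uniform Guidelines.
# The applicant and hire counts below are hypothetical illustrations.

def selection_rate(hired: int, applicants: int) -> float:
    """Fraction of applicants who were hired."""
    return hired / applicants

# Hypothetical outcomes of a "culture fit" screen.
rates = {
    "group_a": selection_rate(hired=48, applicants=100),  # 48%
    "group_b": selection_rate(hired=24, applicants=100),  # 24%
}

highest = max(rates.values())
for group, rate in rates.items():
    ratio = rate / highest  # group's rate vs. the most-selected group's rate
    flag = "possible adverse impact" if ratio < 0.80 else "within four-fifths"
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```

In this invented example, group_b's ratio is 0.50, well under the 0.80 threshold, which would prompt the validation and less-discriminatory-alternative analysis described above.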

Core Values and Transparency

Zappos established ten core values to clearly define the Zappos Family culture, values that are to be reflected in everything they do and every interaction they have. When searching for potential employees, Zappos looks for people who both understand the need for these core values and are willing to embrace and embody them. The ten core values are explained by Zappos employees in the following video:

[Embedded video: Zappos employees explain the ten core values.]

With the exception of the CEO, there appear to be no persons of color among the seven employees interviewed in the video. It may well be that the video is not a representative sampling of employee diversity at Zappos and, if so, the company, acting in accordance with the core value “Build Open and Honest Relationships With Communication,” could publish information on its workforce diversity along the lines of Google's recent disclosure. As Google's Senior Vice President of People Operations wrote:
We've always been reluctant to publish numbers about the diversity of our workforce at Google. We now realize we were wrong, and that it’s time to be candid about the issues. Put simply, Google is not where we want to be when it comes to diversity, and it’s hard to address these kinds of challenges if you’re not prepared to discuss them openly, and with the facts. 
The same spirit of openness that leads Zappos to supply information on sales to its suppliers through its extranet should lead it to provide its workforce diversity information to its employees, applicants, and the general public - Zappos' customers. 

Monday, May 19, 2014

EEOC's Next Step in Preserving Values

The recent White House report, “Big Data: Seizing Opportunities, Preserving Values," found that, "while big data can be used for great social good, it can also be used in ways that perpetrate social harms or render outcomes that have inequitable impacts, even when discrimination is not intended." The fact sheet accompanying the White House report warns:
As more decisions about our commercial and personal lives are determined by algorithms and automated processes, we must pay careful attention that big data does not systematically disadvantage certain groups, whether inadvertently or intentionally. We must prevent new modes of discrimination that some uses of big data may enable, particularly with regard to longstanding civil rights protections in housing, employment, and credit.

In order to address the potential for big data analytics to systematically disadvantage certain groups, the White House report contains the following policy recommendation:
The federal government’s lead civil rights and consumer protection agencies, including the Department of Justice, the Federal Trade Commission, the Consumer Financial Protection Bureau, and the Equal Employment Opportunity Commission, should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law in such cases. In assessing the potential concerns to address, the agencies may consider the classes of data, contexts of collection, and segments of the population that warrant particular attention, including for example genomic information or information about people with disabilities. 
Examples of Discriminatory Big Data Practices and Outcomes

Examples of practices and outcomes facilitated by big data analytics that could have a discriminatory impact on protected classes include:
  • screening applicants by residential location or commute distance, which can serve as a proxy for racial or ethnic identity (see From What Distance is Discrimination Acceptable?);
  • "digitally redlining" unwanted groups, whether as customers, employees, tenants, or recipients of credit;
  • relying on correlations that treat people as members of analytically constructed groups rather than as individuals.

EEOC's Next Step?

The White House report recommends that the EEOC, as one of the federal government's lead civil rights agencies, expand its technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes. The EEOC's next step may be to call on some of the same resources used by the White House review group led by John Podesta at the three workshops held during the 90-day review period leading to the issuance of the White House report:
The suggestion is that some of the individuals and organizations that co-hosted, presented at, and supported those three workshops be persuaded to take their shows on the road, assisting the EEOC and the other federal civil rights agencies (the Department of Justice, the Federal Trade Commission, and the Consumer Financial Protection Bureau) in understanding and preventing new modes of discrimination that some uses of big data may enable, particularly with regard to housing, employment, and credit.



Thursday, May 15, 2014

White House: Big Data's Role in Employment Discrimination

“Big Data: Seizing Opportunities, Preserving Values," a White House review of how the government and private sector use large sets of data, found that such information could be used to discriminate against Americans on issues such as employment. As noted in the review, "while big data can be used for great social good, it can also be used in ways that perpetrate social harms or render outcomes that have inequitable impacts, even when discrimination is not intended."

Clear Windshield or Rearview Mirror?


Algorithms embody a profound deference to precedent; they draw on the past to act on (and enact) the future. The apparent omniscience of big data may in truth be nothing more than misdirection. Instead of offering a clear windshield, the big data phenomenon may be more like a big rear-view mirror telling us nothing about the future.

Does this deference to precedent result in a self-reinforcing and self-perpetuating system, where individuals are burdened by a history that they are encouraged to repeat and from which they are unable to escape?
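
To see how that deference to precedent can feed on itself, consider a toy screener that scores applicants by their similarity to past hires and then folds each new hire back into the precedent pool. Everything here - the attributes, the data, and the 0.5 cutoff - is invented for illustration:

```python
# Invented illustration: a screener that prefers applicants resembling
# past hires, with each selection fed back into the "precedent" pool.

past_hires = [
    {"school": "State U", "hobby": "golf"},
    {"school": "State U", "hobby": "golf"},
    {"school": "Tech U",  "hobby": "golf"},
]

def similarity(applicant, pool):
    """Fraction of attribute values the applicant shares with the pool."""
    matches = sum(
        applicant[key] == hire[key]
        for hire in pool
        for key in applicant
    )
    return matches / (len(pool) * len(applicant))

applicants = [
    {"school": "State U", "hobby": "golf"},        # matches the pattern
    {"school": "City College", "hobby": "chess"},  # the "other"
]

for person in applicants:
    score = similarity(person, past_hires)
    hired = score >= 0.5
    if hired:
        past_hires.append(person)  # precedent now leans further this way
    print(person, f"score={score:.2f}", "hired" if hired else "rejected")
```

The first applicant scores high and is added to the pool, making the pattern stronger; the second scores zero. Each round of hiring makes the history harder to escape.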

Already burdened segments of the population can become further victimized through the use of sophisticated algorithms that identify, classify, segment, and target individuals as members of analytically constructed groups. In creating these groups, the algorithms rely upon correlations that lead to viewing people as members of populations, categories, or groups rather than as individuals (e.g., persons who live more than X miles from an employer's location). Please see From What Distance is Discrimination Acceptable?
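
A minimal sketch of such a distance-based screen follows, using the haversine formula to approximate distance; the applicants, coordinates, and ten-mile cutoff are all hypothetical:

```python
import math

# Hypothetical applicants: (name, home latitude, home longitude).
applicants = [
    ("applicant_1", 36.10, -115.15),
    ("applicant_2", 36.30, -115.30),
    ("applicant_3", 36.05, -115.10),
]

EMPLOYER = (36.17, -115.14)  # hypothetical employer location
MAX_MILES = 10.0             # hypothetical cutoff

def miles_between(a, b):
    """Approximate great-circle distance in miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(h))  # Earth radius ~3,959 miles

# The screen never sees race or ethnicity, yet where people live often
# correlates with both, so the cutoff can reproduce a neighborhood pattern.
for name, lat, lon in applicants:
    d = miles_between((lat, lon), EMPLOYER)
    verdict = "screened out" if d > MAX_MILES else "advances"
    print(f"{name}: {d:.1f} miles -> {verdict}")
```

No protected characteristic appears anywhere in the code, which is precisely the point: the disparate impact, if any, enters through what the distance variable correlates with.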

Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to “digitally redline” unwanted groups, either as customers, employees, tenants, or recipients of credit. A significant finding of the White House report is that big data could enable new forms of discrimination.

Correlation Does Not Equal Causation

Decisions based solely on correlation are inherently suspect: correlation does not equal causation. This point is made vividly by Tyler Vigen, a law student at Harvard who, in his spare time, put together a website that finds very, very high correlations - as shown below - between things that are absolutely not related.

[Charts from Tyler Vigen's site showing three spurious correlations between unrelated time series.]
Each of these pairings has a correlation coefficient in excess of 0.99, demonstrating that a strong correlation isn't nearly enough to support strong conclusions about how two phenomena are related to each other.
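
The arithmetic behind such coefficients is easy to reproduce: any two series that merely trend in the same direction will score near 1.0. A short sketch with invented data:

```python
# Pearson correlation between two invented, causally unrelated series.
# Both merely trend upward over the same years, which is enough to
# produce a near-perfect coefficient.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical yearly figures for two unrelated phenomena.
series_a = [102, 110, 118, 127, 135, 144]        # e.g., widget sales
series_b = [40.1, 41.0, 42.2, 43.1, 44.3, 45.0]  # e.g., a rainfall index

print(f"r = {pearson(series_a, series_b):.3f}")  # ~0.99+, yet no causal link
```

A hiring algorithm fed enough variables will always find correlations this strong; the question is whether any of them measure job capability.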

Shrouding Opacity In The Guise of Legitimacy

Some of the most profound challenges revealed by the White House Report concern how big data analytics may lead to disparate inequitable treatment, particularly of disadvantaged groups, or create such an opaque decision-making environment that individual autonomy is lost in an impenetrable set of algorithms.

Workforce analytic systems, designed in part to mitigate risks for employers, have become sources of material risk, both to job applicants and employers. The systems create the perception of stability through probabilistic reasoning and the experience of accuracy, reliability, and comprehensiveness through automation and presentation. But in so doing, technology systems draw attention away from uncertainty and partiality. Please see Workforce Science: A Critical Look at Big Data and the Selection and Management of Employees.

Moreover, they shroud opacity—and the challenges for oversight that opacity presents—in the guise of legitimacy, providing the allure of shortcuts and safe harbors for actors both challenged by resource constraints and desperate for acceptable means to demonstrate compliance with legal mandates and market expectations.

Programming and mathematical idiom (e.g., correlations) can shield layers of embedded assumptions from higher level decisionmakers at an employer who are charged with meaningful oversight and can mask important concerns with a veneer of transparency.

This problem is compounded in the case of regulators outside the firm, who frequently lack the resources or vantage to peer inside buried decision processes. In recognition of this problem, the White House Report states that "[t]he federal government must pay attention to the potential for big data technologies to facilitate discrimination inconsistent with the country’s laws and values" and offers the following among its six policy recommendations:
The federal government’s lead civil rights and consumer protection agencies, including the Department of Justice, the Federal Trade Commission, the Consumer Financial Protection Bureau, and the Equal Employment Opportunity Commission, should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law in such cases. In assessing the potential concerns to address, the agencies may consider the classes of data, contexts of collection, and segments of the population that warrant particular attention, including for example genomic information or information about people with disabilities. 

Thursday, May 1, 2014

Big Data: Seizing Opportunities, Preserving Values (Excerpted)

A White House review of how the government and private sector use large sets of data has found that such information could be used to discriminate against Americans on issues such as employment. Findings and recommendations of the review, “Big Data: Seizing Opportunities, Preserving Values," were released today (May 1, 2014). The report is the culmination of a 90-day review by the Obama administration, spearheaded by Counselor John Podesta and including two Cabinet members: Penny Pritzker (Secretary of Commerce) and Ernest J. Moniz (Secretary of Energy).


Excerpts from the report relating to big data analytics and employment discrimination follow, along with their location in the report:
A significant finding of this report is that big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans’ relationship with data should expand, not diminish, their opportunities and potential.  (cover letter to President Obama)
Some of the most profound challenges revealed during this review concern how big data analytics may lead to disparate inequitable treatment, particularly of disadvantaged groups, or create such an opaque decision-making environment that individual autonomy is lost in an impenetrable set of algorithms.  (p. 10)
Regardless of technological advances, the American public retains the power to structure the policies and laws that govern the use of new technologies in a way that protects foundational values. 
Big data is changing the world. But it is not changing Americans’ belief in the value of protecting personal privacy, of ensuring fairness, or of preventing discrimination.  (p. 10)

[T]he civil rights community is concerned that such algorithmic decisions raise the specter of “redlining” in the digital economy—the potential to discriminate against the most vulnerable classes of our society under the guise of neutral algorithms.  (p. 46) 
An important conclusion of this study is that big data technologies can cause societal harms beyond damages to privacy, such as discrimination against individuals and groups. This discrimination can be the inadvertent outcome of the way big data technologies are structured and used. It can also be the result of intent to prey on vulnerable classes.  (p. 51) 
We have taken considerable steps as a society to mandate fairness in specific domains, including employment, credit, insurance, health, housing, and education. Existing legislative and regulatory protections govern how personal data can be used in each of these contexts. Though predictive algorithms are permitted to be used in certain ways, the data that goes into them and the decisions made with their assistance are subject to some degree of transparency, correction, and means of redress. For important decisions like employment, credit, and insurance, consumers have a right to learn why a decision was made against them and what information was used to make it, and to correct the underlying information if it is in error.  
Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to “digitally redline” unwanted groups, either as customers, employees, tenants, or recipients of credit. A significant finding of this report is that big data could enable new forms of discrimination and predatory practices. 
Whether big data will build greater equality for all Americans or exacerbate existing inequalities depends entirely on how its technologies are applied in the years to come, what kinds of protections are present in the law, and how the law is enforced. (p. 53)

Putting greater emphasis on a responsible use framework has many potential advantages. It shifts the responsibility from the individual, who is not well equipped to understand or contest consent notices as they are currently structured in the marketplace, to the entities that collect, maintain, and use data. Focusing on responsible use also holds data collectors and users accountable for how they manage the data and any harms it causes. (p. 56)
An important finding of this review is that while big data can be used for great social good, it can also be used in ways that perpetrate social harms or render outcomes that have inequitable impacts, even when discrimination is not intended. Small biases have the potential to become cumulative, affecting a wide range of outcomes for certain disadvantaged groups. Society must take steps to guard against these potential harms by ensuring power is appropriately balanced between individuals and institutions, whether between citizen and government, consumer and firm, or employee and business. (pp. 58-59)
Policy Recommendations: 
Expand Technical Expertise to Stop Discrimination. The federal government’s lead civil rights and consumer protection agencies should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law. (p. 60) 
We must begin a national conversation on big data, discrimination, and civil liberties.  (p. 64) 
The federal government must pay attention to the potential for big data technologies to facilitate discrimination inconsistent with the country’s laws and values.
RECOMMENDATION: The federal government’s lead civil rights and consumer protection agencies, including the Department of Justice, the Federal Trade Commission, the Consumer Financial Protection Bureau, and the Equal Employment Opportunity Commission, should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law in such cases. In assessing the potential concerns to address, the agencies may consider the classes of data, contexts of collection, and segments of the population that warrant particular attention, including for example genomic information or information about people with disabilities. (p. 65)