
The US Supreme Court’s SFFA decision ending affirmative action in higher education continues to have ramifications for corporate America. Attacks on workplace DEI are gaining momentum, with targeted challenges from a variety of angles, not least from conservative advocacy groups filing lawsuits, requesting agency investigations and pursuing other complaints. Just last week, as many prepared to watch Taylor Swift’s boyfriend perform in the Super Bowl, America First Legal (a nonprofit founded by a former adviser to Donald Trump) filed an EEOC complaint against the NFL challenging the Rooney Rule, a widely used hiring practice that originated in the NFL and is now followed across corporate America. For in-house counsel, this underscores the need to diligently monitor the changing DEI landscape for signals warranting targeted audits or adjustments to workplace DEI programming.

When should in-house counsel take action? Let’s start to answer that question by looking at where we are now and the escalation of events over the past seven months.

Timeline of Recent Material Attacks on Workplace ID&E

July 2023 | Letter to Employers from 13 State AGs

Thirteen attorneys general cited SFFA in support of their opposition to corporate DEI programs (see their letter to Fortune 100 CEOs here). In response, attorneys general from other states wrote to the same CEOs stating that SFFA “does not prohibit, or even impose new limits on, the ability of private employers to pursue diversity, equity, and inclusion.”

Continue Reading: Is The Risk Calculus Related To Workplace DEI Shifting For US Employers This Election Year?

Does your holiday wish list include CLE credit and a quick tutorial on what to expect in California labor and employment law next year?

Excellent!

Join us for our virtual California 2023-2024 Employment Law Update on Wednesday, December 13 @ 1PM PT.

2023 has been a year of dramatic change for California employers, but have

In 2023, we helped US employers overcome a host of new challenges across the employment law landscape. Many companies started the year with difficult cost-cutting decisions and hybrid work challenges. More recently, employers faced challenges around intense political discourse boiling over in the workplace. We’ve worked hard to keep our clients ahead of the curve on these

On October 30, 2023, President Biden issued a 63-page Executive Order to define the trajectory of artificial intelligence adoption, governance and usage within the United States government. The Executive Order outlines eight guiding principles and priorities for US federal agencies to adhere to as they adopt, govern and use AI. While safety and security are predictably high on the list, so too is a desire to make America a leader in the AI industry, including through AI development by the federal government itself. While executive orders are not statutes or regulations and do not require confirmation by Congress, they are binding and can have the force of law, usually based on existing statutory powers.

Instruction to Federal Agencies and Impact on Non-Governmental Entities

The Order directs a majority of federal agencies to address AI’s specific implications for their sectors, setting varied timelines ranging from 30 to 365 days for each applicable agency to implement specific requirements set forth in the Order.

The actions required of federal agencies will impact non-government entities in a number of ways, because agencies will seek to impose contractual obligations implementing provisions of the Order, or will invoke statutory powers under the Defense Production Act for the national defense and the protection of critical infrastructure, including by: (i) introducing reporting and other obligations for technology providers (both foundational model providers and IaaS providers); (ii) adding requirements for entities that work with the federal government in a contracting capacity; and (iii) influencing overall AI policy development.

Continue Reading: Biden’s Wide-Ranging Executive Order on Artificial Intelligence Sets Stage For Regulation, Investment, Oversight and Accountability

With special thanks to Danielle Benecke and Ben Allgrove for their contributions.

Baker McKenzie recently hosted industry leaders from Anthropic, Google Cloud and OpenAI in Palo Alto to discuss how in-house legal counsel can best reckon with the transformative power of GenAI.

Baker McKenzie partners joined the panel, sharing insights from their vantage point


Employee handbooks are at the top of employers’ priority lists.

Why? The NLRB’s recent decision in Stericycle adopted a retroactive “employee friendly” standard for workplace rules, including those often included in handbooks. In addition, the new year often rings in new laws requiring changes to workplace policies often included in handbooks. And, the US Supreme

Special thanks to presenters David Hackett, Eva-Maria Ségur-Cabanac, Sali Wissa, Peter Tomczak, Daniel De Deo and William-James Kettlewell.

ESG reporting is evolving quickly. Earlier this year the EU Corporate Sustainability Reporting Directive (CSRD) went into effect, which has broad legal implications for US companies with EU subsidiaries that meet

The current increase in market volatility and heightened regulatory scrutiny has made for a treacherous landscape for multinational employers, and we’re here to help. Join us on October 18th in our New York office to connect on cutting-edge Employment & Compensation issues with a series of panel discussions, presentations and peer roundtables discussing the

Special thanks to our Baker McKenzie speakers Danielle Benecke and Ben Allgrove, and Industry Experts Ashley Pantuliano, Associate General Counsel, OpenAI, Julian Tsisin, Global Legal & Compliance Technology, Meta, Janel Thamkul, Deputy General Counsel, Anthropic, and Suneil Thomas, Managing Counsel, Google Cloud AI.

Baker McKenzie is pleased to invite you to an

New York may soon restrict employers and employment agencies from using fully automated decision-making tools to screen job candidates or make other employment decisions that impact the compensation, benefits, work schedule, performance evaluations, or other terms of employment of employees or independent contractors. Draft Senate Bill 7623, introduced August 4, would limit the use of such tools and require human oversight of certain final decisions regarding hiring, promotion, termination, discipline, or compensation. Senate Bill 7623 would also significantly regulate the use of certain workplace monitoring technologies, going beyond the notice requirements for workplace monitoring in effect in New York since May 2022 and introducing the data minimization and proportionality requirements that are becoming increasingly common in US state privacy laws.

While there is not yet a federal law focused on AI (the Biden administration and federal agencies have issued guidance documents on AI use and are actively studying the issue), a number of cities and states have introduced bills or resolutions relating to AI in the workplace. These state and local efforts are all at different stages of the legislative process, with some paving the path for others. For example, New York City’s Local Law 144 took effect on July 5, prohibiting employers and employment agencies from using certain automated employment decision tools unless the tools have undergone a bias audit within one year of the use of the tools, information about the bias audit is publicly available, and certain notices have been provided to employees or job candidates (read more here).
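To make the bias-audit concept concrete: Local Law 144 defines an “impact ratio” for a demographic category as that category’s selection rate divided by the selection rate of the most selected category. The sketch below is purely illustrative (not legal guidance, and not the statute’s prescribed methodology in full); the function name and sample data are hypothetical.

```python
# Illustrative sketch of a Local Law 144-style impact-ratio calculation.
# An impact ratio is a category's selection rate divided by the selection
# rate of the most selected category. Data and names are hypothetical.
from collections import defaultdict

def impact_ratios(records):
    """records: iterable of (category, selected: bool) pairs."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for category, was_selected in records:
        totals[category] += 1
        if was_selected:
            selected[category] += 1
    # Selection rate per category, then normalize by the highest rate.
    rates = {c: selected[c] / totals[c] for c in totals}
    top_rate = max(rates.values())
    return {c: rate / top_rate for c, rate in rates.items()}

applicants = [
    ("A", True), ("A", True), ("A", False), ("A", False),   # 50% selected
    ("B", True), ("B", False), ("B", False), ("B", False),  # 25% selected
]
print(impact_ratios(applicants))  # {'A': 1.0, 'B': 0.5}
```

An auditor would flag categories whose ratio falls well below 1.0 for further scrutiny; the law itself sets out additional requirements (e.g., intersectional categories and public disclosure) beyond this simple calculation.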

If enacted, Senate Bill 7623 would take things much further. Here are some of the most significant implications of the draft legislation:

Continue Reading: Check Yourself Before You Wreck Yourself: New York and Other States Have Big Plans For Employer Use of AI and Other Workplace Monitoring Tools