Cyber Scene - 2024: Taking It From the Top
By krahal
As we launch 2024 with overflowing cyber concerns, it appears that the whole of government, foreign and domestic, is working hard on some of the most pressing technological challenges of today and the future. Since money continues to make the world go 'round, this Cyber Scene will start, from the White House, with official approval of the difficult House and Senate work on the National Defense Authorization Act (NDAA) for Fiscal Year 2024.
Capitol Hill representatives and senators found common ground, and the NDAA's 3,000+ pages were signed, sealed, and delivered to the White House on 12 December and signed by President Biden on 22 December. Money to execute is still in arrears, so to speak, as the fiscal year started on 1 October 2023. This left the Department of Defense (DoD), the US tech world, and government workers in a bind. However, $886 billion is a sizeable amount of funding to consider, and cyber seeps into countless corners of this NDAA.
House Armed Services Committee Chairman Mike Rogers (R-AL) confirmed that "Enacting the NDAA has never been more vital than today; we and our allies face unprecedented and rapidly evolving threats from China, Russia, Iran, North Korea and terrorist organizations throughout the world," as reported by Defense News' Bryant Harris.
Returning to 22 December 2023, the President thanked Congress particularly for the extension of Title VII of the Foreign Intelligence Surveillance Act (FISA), as he "looks forward to working with the Congress on the reauthorization of this vital national security authority as soon as possible in the new year."
This readership may wonder what the NDAA impact might be. Steve Blank, writing in C4ISRNET, gives us some perspective. He avers that DoD is "getting its innovation act together, but more can be done." Blank writes that "today, innovation in doctrine and concepts is driven by four major external upheavals that simultaneously threaten our military and economic advantage." He also notes that the Defense Innovation Unit will report to the Secretary of Defense. Deputy Defense Secretary Kathleen Hicks predicts that change is in the air and that she is "…building a coalition of the willing to get it done." Her concern is how fast this can move. The impact may include "…rapid technological change in artificial intelligence, autonomy, cyber, space, biotechnology, semiconductors, hypersonics, etc., with many driven by commercial companies in the US and China."
The Supreme Court's Chief Justice John Roberts is our Cyber Scene spokesperson regarding what the Court expects to address, and particularly his own long interest in "the intersection of law and technology," as reported by the Times' Adam Liptak. As an expert on adjudication, the Chief Justice is looking at both AI's promise and its dangers, according to the Court's annual year-end report. He remarks: "At least at present, studies show a persistent public perception of a human-A.I. fairness gap, reflecting the view that human adjudications, for all of their flaws, are fairer than whatever the machine spits out."
He goes on to write: "Law professors report with both awe and angst that A.I. apparently can earn Bs on law school assignments and even pass the bar exam. Legal research may soon be unimaginable without it. A.I. obviously has great potential to dramatically increase access to key information for lawyers and nonlawyers alike. But just as obviously it risks invading privacy interests and dehumanizing the law." He closes by noting that both caution and humility are requirements for anyone's use of AI.
While SCOTUS and the DOJ work on national and international issues, the Wall Street Journal reports that two California Congressmen, Mike Thompson (D-CA) and John Garamendi (D-CA), have engaged the Department of the Treasury's Committee on Foreign Investment in the United States (CFIUS) to investigate billions of dollars in land purchases, intended for a technology development, abutting and nearly surrounding Travis Air Force Base in California. The land lies in Rep. Thompson's constituency and abuts that of Rep. Garamendi. The problem is the foreign origin of the funding, and of two foreign-born "tech titans" in California. The WSJ diagram of the area speaks volumes.
DOJ officials are nearing the end of an investigation into how Apple, through hardware and software changes, has made it difficult for its consumers to leave Apple and its iPhones. The Times' David McCabe and Tripp Mickle cast this as "…the most consequential federal antitrust lawsuit challenging Apple, which is the most valuable tech company in the world." If DOJ sues, Apple would become the fourth of the biggest tech companies to face a federal antitrust challenge (Google has faced two), with the Federal Trade Commission (FTC) having sued Amazon and Meta for "stifling competition."
Foreign Affairs' end-of-year edition includes Michèle Flournoy's analysis, "AI Is Already at War." The former Under Secretary of Defense for Policy (2009-2012) looks at the power of human creativity and commercial gain as seen in the US and China. She notes that "China's ability to use cyber and electronic warfare against US networks and critical infrastructure would also be dangerously enhanced. Put simply, the Pentagon needs to accelerate…its adoption of responsible AI." She believes that the US government is still struggling with innovation (regarding technologies) and speed (regarding AI): it needs more professionals, more funding, and more testing and evaluation processes and platforms to integrate AI into military systems safely and securely. She notes that Hill leaders are paying attention to this, and that DoD has issued a policy framework to expedite responsible and safe adoption of AI. What is in the mix is advancing AI while adding guardrails.
She criticizes Congress for not providing the Pentagon's Chief Digital and Artificial Intelligence Office with the resources needed to bring AI to DoD. She cites some progress in transforming US security, such as the USAF using AI to predict the impact of a single decision to reshape its program and budget, and adaptations for the USAF's F-35s. She also underlines the importance of predictive AI, which could provide a "better understanding of what its potential adversaries might be thinking." She also discusses advantages to the intelligence community, which is already engaged, and support to military operations in several ways. And she notes that the Pentagon will be in competition with the private sector for AI talent.
She concludes by noting that US policymakers understand the paradox of going too slow or too fast regarding AI adoption, and that "…both Democratic and Republican policymakers know that some regulation and oversight is essential to ensuring that AI adoption is safe and responsible." She closes by noting that both the House and Senate are working on this, but that speed and safety are not yet balanced: "…Americans risk being caught in a world of both spiraling AI dangers and declining US power and influence."
As if in response to Flournoy's concern, the 3 January 2024 Foreign Affairs article "Artificial Intelligence's Threat to Democracy," by Jen Easterly, Scott Schwab, and Cait Conley, addresses how to counter AI-driven misinformation and cyberattacks targeting US elections. They should know: Jen Easterly is the Cybersecurity and Infrastructure Security Agency (CISA) Director, Cait Conley is CISA's Senior Election Security Advisor, and Scott Schwab is Kansas Secretary of State and its Chief Election Official.
The article underscores the hugely disruptive side of AI and how it will "…amplify cybersecurity risks and make it easier, faster, and cheaper to flood the country with fake content." A large part of protecting the upcoming elections will fall to the country's state and local officials, who are responsible, serious protectors: "For nearly 250 years, these officials have protected the electoral process from foreign adversaries, wars, natural disasters, pandemics and disruptive technologies." But the looming challenges will put at stake "…nothing less than the foundation of American democracy." The authors cite the Office of the Director of National Intelligence's December 2023 assessment comparing the 2018 and 2022 midterm elections: "The involvement of more foreign actors (in 2022) probably reflects shifting geopolitical risk calculus, perceptions that election influence activity has been normalized (and) the low cost but potentially high reward for such activities." AI will make such activities even cheaper and more effective. Moreover, "AI enabled translation services, account creation tools, and data aggregation will allow bad actors to automate their processes and target individuals and organizations more precisely and at scale." The UK's elections, which precede those in the US, will be the so-called canary in the election mine.
As these challenges move forward, the Economist's Ludwig Siegele (The World Ahead 2024) underscores another issue: the tension between national control and the need to create an international organization to control AI. He notes that "National laws might be able to deal with simpler AI applications and LLMs (Large Language Models), but frontier models may require global rules." Siegele offers examples of how to do so by pointing back to three international bodies which started small, grew, expanded, and serve as exemplars of how to go forward: the International Civil Aviation Organization (ICAO, 1944), the European Organization for Nuclear Research (CERN, 1952), and the International Atomic Energy Agency (IAEA), the latter operating out of a hotel in Vienna until 1979. Siegele goes on to note that Microsoft advocates the ICAO format, AI researchers prefer CERN, and OpenAI prefers the IAEA. He believes that drawing from the creation, organization, and success of these stalwart entities points to what some are now calling an "International Panel on AI Safety." So far, he notes, Ursula von der Leyen, the European Commission President, as well as a group of tech executives, have endorsed this direction.
All technologies, great and small, are playing out next door and around the world. AI and its innumerable attendant issues are making defense and offense, and the collaboration they require, a mighty challenge at home and abroad.
To see previous articles, please visit the Cyber Scene Archive.