
Data Driven

Jarrad Jinks takes a deep dive into artificial intelligence with Christian Brown ’12

“Hey Siri, schedule an interview with Christian Brown at 5:30 Pacific time for The Ambassador.” As Siri catches the tail end of my request, I turn my attention back to Gmail. Christian Brown ’12 had previously agreed to speak with me about his professional role in data science and Artificial Intelligence (AI). As I continue to type, I tab to autocomplete the common phrases I’m apt to use in similar contexts. A notoriously poor speller, I revisit a couple of red underlines before sending. AI in 2020 is pervasive. So pervasive, in fact, that even the age-old cornerstones of human interaction—written and spoken language—are increasingly shaped by it. Siri adds Brown to my calendar, adjusting for my current time zone. The moment I hit enter on the keyboard, Google servers set off a split-second series of events that land my message in Brown’s inbox, but not before his own email servers use an artificially intelligent system to read not just the content of my email, but a slew of additional metadata to assign my correspondence a spam score. It passed.
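Spam filters like the one described above are often built on statistical classifiers. The toy sketch below is a naive Bayes scorer over invented word counts (none of these numbers come from a real filter): it compares how often each word appears in known spam versus legitimate mail and combines the evidence into a single score.

```python
import math

# Tiny word-frequency tables; a real filter would learn these from
# millions of labeled messages (all values here are made up).
SPAM_COUNTS = {"free": 40, "winner": 25, "interview": 2, "schedule": 3}
HAM_COUNTS = {"free": 5, "winner": 1, "interview": 30, "schedule": 28}

def spam_score(message: str) -> float:
    """Return P(spam | words) under naive Bayes with Laplace smoothing."""
    spam_total = sum(SPAM_COUNTS.values())
    ham_total = sum(HAM_COUNTS.values())
    vocab = set(SPAM_COUNTS) | set(HAM_COUNTS)
    log_spam = math.log(0.5)  # assume equal priors for spam and ham
    log_ham = math.log(0.5)
    for word in message.lower().split():
        if word not in vocab:
            continue  # skip words the model has never seen
        log_spam += math.log((SPAM_COUNTS.get(word, 0) + 1) / (spam_total + len(vocab)))
        log_ham += math.log((HAM_COUNTS.get(word, 0) + 1) / (ham_total + len(vocab)))
    # Convert the log-odds back into a probability between 0 and 1
    return 1 / (1 + math.exp(log_ham - log_spam))

print(spam_score("schedule an interview"))  # low score: likely legitimate
print(spam_score("free winner"))            # high score: likely spam
```

Real filters add the kind of metadata the article mentions (sender reputation, routing headers) as extra features, but the scoring logic follows the same pattern.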

What is AI? The terms “Artificial Intelligence” and “algorithm” increasingly appear in conversations and headlines. But conveying an understanding of AI proves much more difficult than reporting the significance of its use. 

Artificial Intelligence is an overarching designation given to any computer system intended to mimic intelligent human reasoning. It encompasses disciplines such as natural language processing, the manipulation of human language for applications such as autocomplete, translation, and smart assistants; computer vision, which enables now-common photo album features such as facial recognition; and machine learning, which excels at extracting patterns from large datasets, learning from that data without human intervention, and applying what it learns. These and other subsets of AI often overlap to power not just consumer-level life-enhancements, but technologies that drive processes impacting our lives in a much more significant way, influencing enterprise decision-making, global financial markets, the course of academic research, and government policy.

With such potential at the fingertips of the knowledgeable few who program these intelligences, Brown can perhaps be considered quite a powerful person. As a senior data scientist with an enterprise AI software developer, Brown and his colleagues work with some of the world's largest and most influential organizations to understand their data and put it to use for predictive analytics, fraud detection, intelligent management, inventory optimization and a nearly limitless number of other practical applications. He touches industries from energy and telecommunications to aerospace and even defense, developing needs-based solutions using the wealth of data those organizations collect.

Data science, especially for the development of AI, requires a strong foundation in more than one STEM discipline—math and technology in particular. Brown notes that he’s always had an interest in STEM and identifies one moment that set him on the path towards his current industry: “I guess this started, probably back in high school at ASIJ.” As a junior, he says, “I took an environmental science class that I really liked and that basically sparked this interest for me in energy and environment as a topic.”

Brown, who was born in Japan and attended ASIJ from Kindergarten, fondly recalls memories of STEM at school, noting inspiring teachers such as Jane Maczuzak (FF ’09–’13) in the science department and Kristi Hoskins (FF ’99–’19) in mathematics. The computer science landscape has changed drastically in the short time since Brown left ASIJ. In the high school alone, our computer science course offerings have more than doubled to include robotics, as well as classes on algorithms, data structures and even data science specifically. In a clear indicator of the start of this shift towards a more comprehensive, technology-focused curriculum, ASIJ began its 1:1 program in August of Brown’s senior year, requiring each student to have a laptop.

Teacher Jake Stephens works with a student during a data science class in ASIJ’s Creative Arts Design Center

Today, technology is embedded throughout the curriculum at all levels and teachers introduce tech to students as early as Kindergarten. Early on, the focus is on digital citizenship and appropriate usage, followed by introductions to basic circuitry and even relaying instructions to simple robots. Our middle school students are introduced to contemporary programming languages from sixth grade, when they have the opportunity to take a design technology course that walks them through the development of a digital pet using JavaScript. They’re introduced to further opportunities in seventh grade, when they can program VEX robots, and eighth grade, when they have the option to take an AppLab course in JavaScript or continue on with design technology. These curricular courses are complemented by the Computer Science Discoveries, Mob Programming and Robotics clubs.

Computer science course offerings in high school continue to expand and include options that introduce students to programming, continue that learning with intermediate courses, and apply their skills across a variety of practical scenarios—including those focused on data science and robotics. Our introductory and intermediate programming courses have no prerequisites and focus on developing student skills in Python—the most commonly used language in data science. Students may then proceed to AP Computer Science, which uses Java. The recently introduced Data Science and Algorithms & Data Structures courses are also taught in Python.

Brown held onto his passion for environmental science, matriculating to Stanford University intent on continuing his education with a focus on energy and environment. “I did this program called Energy Resources Engineering, which is basically an engineering perspective and approach to energy problems, whether it's oil and gas-related or whether it's renewable energy systems, power systems, utilities, power grid. Over the course of that study, I was exposed to different types of problems.” Brown describes the breadth of challenges the program exposed him to and his interest in one question in particular—how do you efficiently operate a power grid? “As you can imagine, there's some structure to the power grid, but there's also a lot of data describing how the power grid is operating. That led me towards an interest in optimization and, broadly speaking, operations research as a topic—which is basically a study of optimal decision making and prediction. How do you make decisions given data, given some understanding of the world or the system you’re trying to model?”

Although he continued to approach these questions from the perspective of energy and environment, Brown eventually realized that such challenges apply broadly, across many industries. “As I broadened my view of the types of problems you can solve, I came across other topics that were more closely related to what we call traditional data science in 2020, like machine learning, deep learning, statistics...I wanted to become a well-rounded professional in that area.” Brown went on to earn both his bachelor’s and master’s in the energy resources engineering program. He feels fortunate to have attended a school with a robust curriculum in computer science, mathematics, and data science that allowed him to focus on his passion for energy during his undergrad and work his way into broader topics, not necessarily unique to energy, during his graduate learning. “When I was looking for jobs out of grad school, I contemplated a career purely in energy. I was also looking at careers in data science and the reason I joined is because it sort of marries the two things. It was kind of like a technology and data science company that certainly was involved in the energy industry and could certainly benefit from having that experience in energy. But they looked at industry a little more broadly and holistically, and I was looking to dip my toes in that...Maybe I'll find my way back into energy one day.”

Until then, Brown feels challenged by and fulfilled in his work with the company. He describes its complex mission at the highest level: “it's a software platform that basically makes it easier to develop and deploy AI and big data software applications, at scale, across an enterprise.” As with AI at the consumer level, the most practical way to understand what enterprise AI looks like is through an example. Brown illustrates, hypothetically, “A classic example might be, at a company such as Royal Dutch Shell—they operate several hundred offshore oil rigs and they have sensor data for all these offshore oil rigs. I'm interested in having a unified view of all this data. I want to be able to access it easily, make sure it's consistent, the quality is good, and I want to be able to leverage it specifically with respect to some application. An application might be to use sensor data to predict when the compressor valves are going to fail on the oil rigs.” He describes that type of predictive analysis as both a technically and methodologically difficult task. “It's not easy to have software do that, so the whole premise of the company is to support that type of use-case.”
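The failure-prediction scenario Brown describes is far more sophisticated in practice, but a minimal sketch conveys the shape of the task. The toy detector below (invented readings, made-up window and threshold) flags any sensor value that deviates sharply from its recent rolling average:

```python
import statistics

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag indices where a reading deviates sharply from the recent
    rolling average — a crude stand-in for the learned models that
    predict equipment failure from sensor streams."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent) or 1e-9  # avoid divide-by-zero
        if abs(readings[i] - mean) / stdev > threshold:
            flagged.append(i)
    return flagged

# Simulated compressor-valve pressure readings with one sudden spike
pressure = [100.1, 99.8, 100.3, 100.0, 99.9, 100.2, 100.1, 140.0, 100.0]
print(flag_anomalies(pressure))  # → [7]
```

Production systems replace the rolling average with models trained on historical failures, but the goal is the same: surface the reading that warrants a maintenance call before the valve actually fails.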

A nearly 500-person company, Brown’s employer counts him among a global team of about 50 data scientists who scope projects, wrangle data, and develop the AI itself. Brown and his teammates’ responsibilities are not limited to technical data exploration and coding, but also encompass collaboration and outreach: “We specialize in hearing what business problems the customer is interested in solving and translating that into a tangible data science task that is well-defined, approachable, and solvable.” Brown details his role, “I'm often involved in the end-to-end life cycles of different types of engagements with customers. I help scope the type of work, see if there's a business problem that the customer is potentially interested in solving, whether our technology could help them with that, and then I’m also responsible for designing and developing, or translating their business into a data science task—a task that we are going to answer by data and the associated technique.”

Brown reminds me that AI is not just an enterprise tool to churn profit or a consumer commodity to help us choose our next Netflix binge. He sees the technology used in health and medicine, the nonprofit sector, and clean energy—areas with the most significant impacts on quality of life and sustainability. But as with any powerful technology, questions of ethics and misuse soberly balance the pros with the potential cons. Brown perks up when asked about how AI can be misappropriated. “I’ll tell you about something that's been on my mind a lot recently...have you heard of deepfakes?” Deepfakes are a form of synthetic media built upon an existing image or video in which a person’s likeness is altered so that they appear to be someone else. The process primarily falls under the AI umbrella of “computer vision,” though it extends beyond still and moving images to audio and voice alterations as well. Relatively convincing examples of these doctored videos can be found on YouTube, and the technology is only improving.

Deepfakes are an especially apropos example of AI misuse in a time when “fake news” is at the forefront of recent election cycles and organizations such as Cambridge Analytica have demonstrated the influence AI can have on democratic processes. “I think that's super dangerous from the misinformation perspective, and I worry about it a lot because I don't really see a meaningful way to control it.” Brown refers to the unprecedented access to information, learning materials, and open-source software that makes some AI technologies, at least at a low level, obtainable and usable for those wishing to learn. The barrier to entry is minimal for both good and bad actors. Though, as Brown reflects, to effect the greatest change, you need some serious resources. He points to an example that seems outwardly trivial—AlphaGo.

South Korean professional Go player Lee Sedol, right, plays in the Google DeepMind challenge in 2016. (DeepMind)

AlphaGo is a program designed by DeepMind Technologies (now owned by Google) to play the board game Go. In March 2016 it became the first computer to beat a nine-dan professional Go player, four games to one. Since then, AlphaGo has been succeeded by two more advanced programs: AlphaGo Zero—currently the world’s top Go player—and AlphaZero, which also plays chess and shogi. AlphaGo made headlines for cracking the complexities of Go at least ten years ahead of predictions. At the time, typical AIs would test all possible moves at any particular point in a game and choose the strongest option. Go, however, has far too many potential moves in any given turn to compute efficiently, so AlphaGo utilized a new method involving what are called deep neural networks and reinforcement learning to learn the game over the course of millions of plays—first against humans, then against itself. AlphaGo Zero improved upon the formula by removing the human equation—playing only against itself and learning strategies without the limitation of human influence. But, as Brown reminds us, “it cost $40 million of compute to train. It had to play literally millions of games. But that is entirely impractical and out of reach for the vast majority of people.”
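The brute-force style of play that AlphaGo moved beyond can be sketched concretely. For a tiny game like one-pile Nim (take 1–3 stones; whoever takes the last stone wins), exhaustively searching every continuation is trivial; Go's game tree is astronomically larger, which is why DeepMind turned to neural networks and self-play instead. A minimal sketch, using Nim purely as a stand-in:

```python
from functools import lru_cache

# Exhaustive game-tree search of the kind described above: evaluate
# every possible continuation and pick the strongest move. Feasible
# for one-pile Nim; hopeless for Go.

@lru_cache(maxsize=None)
def can_win(pile: int) -> bool:
    """True if the player to move can force a win from this pile size."""
    # A move wins if it leaves the opponent in a position they cannot win.
    return any(not can_win(pile - m) for m in (1, 2, 3) if m <= pile)

# The losing positions turn out to be exactly the multiples of 4
print([p for p in range(1, 13) if not can_win(p)])  # → [4, 8, 12]
```

The `lru_cache` memoization keeps the search linear in the pile size here; for Go, no amount of caching rescues a tree with more positions than atoms in the observable universe, which is what forced the shift to learned evaluation.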

AlphaGo took part in the 2017 Future of Go Summit in China, the birthplace of Go. Designed to help unearth even more strategic moves, the summit included a match with the world’s number one player, Ke Jie, seen here with Demis Hassabis, co-founder of DeepMind. (DeepMind)

A fully trained model of any AI program can be made available for public use, should the creators who fronted the compute and development costs choose. In an industry grounded in academia and research, the ideal of open-source information can come with potential negative consequences. This poses one of many problems that regulators face in addressing these technologies, complicated by the speed at which they progress. As a result of these regulatory challenges, some of the industry’s largest names have taken to self-regulation. Developers, such as those at the Elon Musk co-founded OpenAI, decided not to make their latest natural language processing models fully available to the public. One OpenAI project, GPT-n, aims to generate any amount of human-like text, given a prompt. Due to concerns over malicious applications of the technology, OpenAI initially released only a smaller version of its second-generation model, GPT-2, opening larger versions in stages. Similarly, OpenAI recognizes that its third-generation model, GPT-3, would be vulnerable to misuse if released as open source. As such, they’ve limited access to approved customers and use cases, through a channel that they can carefully control.
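GPT-style models are vastly more capable, but the framing they share with much older techniques is "predict the next token, sample it, repeat." The toy word-level Markov chain below (made-up corpus, no neural network involved) illustrates that loop in miniature:

```python
import random
from collections import defaultdict

# A word-level Markov chain: a vastly simpler ancestor of models like
# GPT, but the same loop — predict the next token from the text so
# far, sample it, append, repeat. (The corpus is invented.)
corpus = ("the model reads the prompt and the model writes text "
          "and the model reads text").split()

# Count which word follows which in the corpus
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def generate(prompt: str, length: int, seed: int = 0) -> str:
    random.seed(seed)  # fixed seed so runs are reproducible
    words = prompt.split()
    for _ in range(length):
        followers = transitions.get(words[-1])
        if not followers:
            break  # dead end: the corpus never shows what comes next
        words.append(random.choice(followers))
    return " ".join(words)

print(generate("the model", length=5))
```

Where this toy conditions on only the single previous word, GPT-class models condition on thousands of preceding tokens with billions of learned parameters, which is what makes their output both human-like and hard to police.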


Brown outlines an additional concern: “The second thing that comes to mind for me, with respect to the learning, is that these models are very powerful, but they're very difficult to interpret. The earliest precursor to machine learning, for example, is basically a set of rules, and a set of rules is a very easy thing to interpret and you can follow exactly the decision that a model or an algorithm is making to reach its final decision.” As these models become more and more complex and powerful, they also become much more difficult to understand. “If I have an algorithm that's outputting predictions that are of major consequence, maybe it's the type of drug I'm going to give somebody and they're going to live or die if I give them the wrong one, and a doctor looks at the outputs and they can’t understand why the model is suggesting this—to blindly trust an algorithm is a dangerous thing.”
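The contrast Brown draws can be made concrete. The rule-based triage sketch below (all names and thresholds invented for illustration, not medical guidance) returns not just a decision but the exact rules that fired, which is the kind of audit trail a deep neural network cannot easily provide:

```python
# A transparent rule-based classifier of the kind Brown contrasts with
# opaque modern models: every decision can be traced rule by rule.
# (The thresholds below are invented for illustration only.)
RULES = [
    ("fever", lambda p: p["temperature_c"] >= 38.0),
    ("rapid heart rate", lambda p: p["heart_rate"] > 100),
]

def triage(patient: dict) -> tuple[str, list[str]]:
    """Return a decision plus the exact rules that fired."""
    fired = [name for name, rule in RULES if rule(patient)]
    decision = "escalate" if len(fired) >= 2 else "routine"
    return decision, fired

decision, reasons = triage({"temperature_c": 39.1, "heart_rate": 112})
print(decision, reasons)  # → escalate ['fever', 'rapid heart rate']
```

A deep model trained on the same task might be far more accurate, but it would answer only "escalate," with no `reasons` list a doctor could inspect — exactly the trust problem Brown describes.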

When asked where he feels we currently stand in terms of AI being a force for good, Brown says, “I think we're figuring it out, honestly, and there's some really promising applications. I think automation is an area where it has proven to be effective.”

He carefully notes the difference between automation that complements and empowers human productivity and automation that destroys livelihoods, citing the time-consuming task of dredging through mountains of email spam—a nearly non-existent task thanks to AI. “I think medicine is an interesting area as well. There's been examples of these models showing success in very quickly identifying breast cancer, for example. If doctors can spend less time diagnosing breast cancer and more time addressing it, I would say that's a good thing.”

The Oxford Internet Institute in collaboration with Google further notes that “AI is proving crucial to advancing science and tackling some of our biggest global challenges.” They specifically identify examples such as apps that cross-reference databases to help farmers pinpoint issues with their crops, and programs that help airlines both maximize cargo space as well as find more efficient flight paths, reducing carbon emissions. Similarly, they identify AI systems dedicated to monitoring climate change through “vast and constantly evolving datasets,” that help model glacier melt, predict sea rise, and identify new chemical structures necessary to create more efficient solar cells, among other uses. 

Brown is equally optimistic about the future of data science and AI. “I think it will grow and I think, in a sense, the reason it's going to grow is because we're living in a world that's increasingly digitized. And what digitizing means is there’s going to be more and more data that you can leverage. You don't need to work at, like, a Google or something, where you have billions of people's data coming in every second, every day, to do something useful.” He paints a picture of a field that is approachable at every level and reiterates that the desire to take advantage of data is common to every industry. “It feels so empowering to be able to leverage that, and I hope to see at least a basic skill set, or competency and understanding around data and taking advantage of data, make its way into all sorts of different professions.” He speculates that the computational expense currently required to perform deep learning to the extent of programs such as AlphaGo may someday be irrelevant: “What if there's a way to make computation way cheaper than it is today? Like, if quantum computing is accessible and cheap and legitimate, you can better enable these types of tasks.”

Short of advancements in hardware, an alternative pathway is to make the algorithms themselves more efficient. “How do you teach an algorithm to drive a car without it needing to watch two hundred thousand years' worth of car driving footage to understand what it means to drive a car? Humans don't need that.” Compared to computers, people have an amazing ability to see a picture of a cat a single time and learn to identify cats for a lifetime, while a computer, as we are currently capable of developing them, would need to see a million pictures of things that are and are not cats. Answers to these questions could further catalyze the advancement of artificially intelligent technologies.

Leaps and bounds or not, artificial intelligence will continue to improve. For AI and data science to remain a force for good, and for us to harness them to shape our lives and our world in a positive light, we need passionate people interested in the associated industries to make a difference. “Data science is such a nascent field. Honestly, people weren't calling themselves data scientists, like, ten years ago, but people have been doing similar work for a long time. Now it is sort of this umbrella term, so there's a whole spectrum of types of activities you could be doing as a data scientist.” He details what he refers to as “different flavors of the work,” continuing, “are you good at navigating through data or performing data monitoring activities? Or do you have a keen eye to catch patterns in the data and tie them directly to your business? Another flavor of the work would be a literal PhD postdoc scientist studying math as it pertains to data science and computing. There's also this third dimension, which is like engineering. There's a role called ‘machine learning engineering,’ where strong software developers develop the infrastructure to support a lot of the cutting-edge research or techniques coming out of academia or industry. There's a lot of different ways to approach it. It's a very broad field.”

While an individual's particular interests or approach to data science may differ greatly in such a broad field, Brown also points out a number of foundational skills that make breaking into data science a more approachable goal. “At a high level, in my personal experience, what's been particularly helpful is the ability to just code; being comfortable with data and able to navigate it is pretty critical, no matter what area you go into. Having a strong statistical background is useful as well and, by extension, linear algebra and the marrying of those two subjects. That's kind of the foundation needed to really understand machine learning and the set of algorithms that facilitate it.”

Diverging for a moment from the technical aspects of the profession, Brown goes on to say that data is nothing without the ability to understand and translate it into something that is valuable, a skill that he speculates requires a broad perspective and the ability to look at problems holistically and to identify trends and patterns. Fortunately for those looking to make the jump, both the technical skills and holistic approaches to data can be learned. “There's so many great resources online to learn this stuff and I don't think you need to have an advanced degree in statistics to be a data scientist or a data science professional. You can do quite a lot just being self-taught.” Brown recommends online courses from sources such as Coursera, Udemy, or Khan Academy as approachable points of entry. He mentions, too, Medium as a platform popular for publishers of data science-related materials and, for practice, Kaggle, which hosts datasets for exposure to data wrangling and task-oriented experience. Finally, personal projects are an invaluable way to learn and explore another interest in the process. “I don't think you need to have a job in data science to learn about data science or make a switch into data science. It's a growing field right now that if you just demonstrate a genuine interest and personal effort, and you can point to a couple of things and say, I've done X, Y and Z, that's going to be really impressive.” 

Brown looks to people’s good-willed nature to continue driving the technology in a positive direction, pointing to those who volunteer at nonprofits lacking their own internal technical expertise, or those using AI to help solve the global energy and environment crisis: “It's not a one-dimensional problem. A lot of different people need to work together on it.” And, as data-driven applications become more and more of a mainstay at every level of business, government, environment and daily life, schools, too, will take notice in their STEM curricula. Our students will certainly be a step ahead as they move on to university and, subsequently, the professional world. Perhaps one of them will improve voice transcription algorithms, help countries get out ahead of the next potential pandemic, or work on the team that develops the next generation in digital assistant technology.

“Hey Siri, remind me to send Brown a thank-you email.” 
