The power of play

A new collaboration between MIT.nano and NCSOFT, a video game development company based in South Korea, will seek to chart the future of how people interact with the world and each other via the latest science and technology for gaming.

The creation of the MIT.nano Immersion Lab Gaming Program, says Brian W. Anthony, the associate director of MIT.nano, arose out of a series of discussions with NCSOFT, a founding member of the MIT.nano industry consortium. “How do we apply the language of gaming to technology research and education?” Anthony asks. “How can we use its tools — and develop new ones — to connect different domains, give people better ways to collaborate, and improve how we communicate and interact?” 

As part of the collaboration, NCSOFT will provide funding to acquire hardware and software tools to outfit the MIT.nano Immersion Lab, the facility’s research and collaboration space for investigations in artificial intelligence, virtual and augmented reality, artistic projects, and other explorations at the intersection of hard tech and human beings. The program, set to run for four years, will also offer annual seed grants to MIT faculty and researchers for projects in science, technology, and applications of:

  • gaming in research and education;
  • communication paradigms;
  • human-level inference; and
  • data analysis and visualization.

A mini-workshop to bring together MIT principal investigators and NCSOFT technology representatives will be held at MIT.nano on April 25.

Anthony, who is also faculty lead for the MechE Alliance Industry Immersion Projects and principal research scientist in the Department of Mechanical Engineering and the Institute for Medical Engineering and Science, says the collaboration will support projects that explore data collection for immersive environments, novel techniques for visualization, and new approaches to motion capture, sensors, and mixed-reality interfaces. 

Collaborating with a gaming company, he says, comes with intriguing opportunities for new research and education paradigms. Specific topics the seed-grant program hopes to prompt include improved detection and reduction of dizziness associated with immersive headsets; automatic voice generation based on the appearance or emotional states of virtual characters; and reducing the cost and improving the accuracy of marker-less, video-based motion capture in open space. 

Another area of interest, Anthony says, is the development of new tools and techniques for detecting and learning how personal gestures communicate intent. “I gesticulate a lot when I’m talking,” he says. “What does it mean, and how do we appropriately capture and model that?” And what if, he adds, researchers could apply what they learn to empower a doctor in a clinical setting? “Maybe the doctor wants to palpate an image of tissue, which includes measures of elasticity, to see how it deforms or slice it to see how it responds. Tools that can record and visualize these gestures and reactions could be tremendously powerful.”

The collaboration’s work will involve the creation of software and the development of new hardware. The Immersion Lab Gaming Program’s leaders hope to spur a range of efforts to explore visual, acoustic, and haptic technologies. For example, Anthony says, it is possible to use sound waves that are outside the range of what is audible to humans for tactile purposes. Expressed with the right tools, these waves can actually be touched in mid-air.

The combination of hardware and software development was one of the key factors in NCSOFT’s decision to work with MIT and to become one of MIT.nano’s founding industry partners, Anthony says. And the third-floor Immersion Lab — specifically designed to facilitate immersive digital experiences — provides a flexible platform for the new program.

“Most of this building is about making or visualizing physical things at the nanometer scale,” Anthony adds. “But this space will be a magnet for people who do data-science research. We want to use the Immersion Lab to increase friction between the hardware and software folks — and I mean friction in a good way, like rubbing elbows — and to interact with the data coming from the building and to imagine the new hardware required to better interact with data, which can then be made in MIT.nano.”

MIT.nano has released the program’s first call for proposals. Applications are due May 25. 

MIT receives $30 million to help address energy challenges in Egypt

MIT is the recipient of a $30 million award from the U.S. Agency for International Development (USAID), announced this week at a two-day ceremony in Cairo.

The award will support MIT over the next five years in developing a Center of Excellence in Energy at Ain Shams University, Mansoura University, and Aswan University, in Egypt. The center will serve to connect researchers at the Egyptian universities with experts at MIT, to seek innovative solutions to the country’s energy challenges.  

The Center of Excellence in Energy is one of three centers to be established in Egypt and funded by USAID through a total investment of $90 million. The centers are formal partnerships between Egyptian and American universities and the private sector to foster research, scholarships, and innovation in agriculture, energy, and water. In addition to the MIT-led center, Cornell University will partner with Cairo University, and the American University in Cairo will partner with Alexandria University, to form comparable centers to address challenges in the areas of agriculture and water, respectively.  

“The Centers will facilitate meaningful collaboration between American and Egyptian universities,” said USAID Mission Director Sherry F. Carlin, in a statement. “They will bring together some of the best minds to collectively address shared goals and challenges, spur innovative thinking, encourage private sector engagement, and strengthen government policy in the agricultural, water, and energy sectors.”

Carlin was present at the ceremony, along with USAID Administrator Mark Green and Sahar Nasr, Egypt’s minister of investment and international cooperation, as well as Khaled Abdel Ghaffar, Egypt’s minister of higher education and scientific research.

The Center of Excellence in Energy will be led by Ahmed Ghoniem, the Ronald C. Crane Professor in MIT’s Department of Mechanical Engineering, and Daniel Frey, a professor of mechanical engineering and the faculty research director for MIT D-Lab. Over the next five years, the team will work to build the research, education, and entrepreneurial capacity of Ain Shams, Mansoura, and Aswan universities to address the country’s most pressing energy-related problems.

“Egypt is one of those places that is likely to suffer significantly from climate change,” Ghoniem says. “If we learn how to solve these problems there, we can learn to scale the solutions and use them in many other places that need them as well.”

The USAID award will enable the MIT team to bring faculty and graduate students from Egypt to the Institute, to learn how to approach large, energy-related challenges from an MIT perspective.

“The MIT modus operandi is that we integrate research and education, and translate that into entrepreneurship,” Ghoniem says. “We very much want to make that model available for Egyptian universities to emulate.”

“We’ll bring faculty and graduate students from Egypt to spend time with us, and we’ll solve problems shoulder to shoulder with them. That ‘mens-et-manus’ mentality is transmitted more effectively by immersing themselves here,” Frey adds, referring to MIT’s motto of “mind and hand.”

Ghoniem and Frey will team Egyptian faculty and students with interdisciplinary researchers across MIT, to develop renewable energy solutions to problems such as Egypt’s practice of open-field burning. The country is primarily an agricultural economy, and as such it produces a significant amount of biomass, which is often disposed of by burning the waste in open fields — a practice that generates enormous amounts of pollution and greenhouse gases.

“They call it the ‘black cloud,’” Ghoniem says. “One of our priorities is to work with them in converting this problem into a solution, using biomass as a clean energy source in the country.”

The new center will also work toward advancing and scaling up sustainable projects that are already underway in Egypt. For instance, the country is the fourth largest user of wind energy and is currently building the largest solar parks in the world, with the goal of generating 42 percent of its electricity using renewable energy by 2025. The MIT team plans to facilitate connections between university researchers and key industrial players in the region, to expand the country’s solar, wind, and other forms of clean energy usage.

“It’s a country where so many things are going on in the energy area that match MIT’s interest in promoting and developing renewable energy technologies as well as addressing global climate change problems,” Ghoniem says.

Throughout the five-year collaboration, there will also be a special emphasis on involving Egyptian women and people with disabilities in coming up with energy-related solutions.

“We’ll be working with organizations in the country that [support] these groups, to try to pull them into the activities, and hopefully they can participate equally in education, research, and entrepreneurship,” Ghoniem says.

Facebook is free, but should it count toward GDP anyway?

For several decades, gross domestic product (GDP), a sum of the value of purchased goods, has been a ubiquitous yardstick of economic activity. More recently, some observers have suggested that GDP falls short because it doesn’t include the value of free online goods such as social media, search engines, maps, videos, and more.

A new study by MIT researchers puts a dollar value on all those free digital goods people use, and builds the case that online activity can and should become part of GDP some day.

For instance, Facebook is worth about $40 to $50 per month for U.S. consumers, according to a series of surveys the researchers conducted. In Europe, digital maps on phones are valued at 59 euros (currently about $67) per month. And the free messaging tool WhatsApp, used most widely outside the U.S., is worth a whopping 536 euros ($611) per month, the survey indicates. 

“The magnitude of the numbers was really striking,” says Avinash Collis, a doctoral candidate in information technologies at the MIT Sloan School of Management, who helped develop the new study.

Or, as the scholars write in a new paper summarizing the results, “digital goods have created large gains in well-being that are not reflected in conventional measures of GDP and productivity.”

The paper, “Using massive online choice experiments to measure changes in well-being,” appears today in Proceedings of the National Academy of Sciences. In addition to Collis, the authors are Erik Brynjolfsson, the Schussel Family Professor of Management at MIT Sloan, and Felix Eggers, an assistant professor of economics at the University of Groningen in the Netherlands.

Ask the people what they want

To conduct the study, the researchers used three large-scale online surveys in which consumers were asked to put a price tag on the free online services they consume. In many cases, respondents were asked whether they would prefer to keep using a free online good, or to name a price that would compensate for losing access to that product. All told, the surveys drew about 65,000 responses.

“The best way to value these digital goods is to go to people directly and ask them,” Collis says.
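The elicitation behind these numbers can be pictured with a small sketch. The code below is a hypothetical illustration, not the authors' actual methodology: given made-up rates at which respondents would accept various payments to give up a service for a month, it interpolates the payment at which half would take the money — a simple stand-in for the median valuation the surveys estimate.

```python
# Illustrative sketch only (not the study's code): recovering a median
# willingness-to-accept (WTA) from binary survey responses of the form
# "keep using the good, or accept $X to lose access for a month."
# The offer amounts and acceptance rates below are made up.

offers = [
    (10, 0.20),  # ($/month offered, share choosing the money over the good)
    (25, 0.35),
    (40, 0.48),
    (50, 0.55),
    (75, 0.70),
]

def median_wta(curve):
    """Linearly interpolate the offer at which half of respondents would
    give up the good -- a simple estimate of the median valuation."""
    for (x0, p0), (x1, p1) in zip(curve, curve[1:]):
        if p0 <= 0.5 <= p1:
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("50% acceptance rate not bracketed by the offers")

print(round(median_wta(offers), 2))  # prints 42.86 for this made-up data
```

With these fabricated responses, the estimated median lands between $40 and $50 per month — the range the surveys reported for Facebook.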

The study produced a number of distinctive findings regarding online services and specific companies. For instance, consumers placed an average annual value of $1,173 on online video streaming services such as YouTube and Netflix. To be sure, these video platforms, among others, do charge fees to some consumers — although those are typically $10 to $20 per month.

Either way — free or with modest charges — the surveys reveal that online video use generates a significant amount of “consumer surplus,” that is, the value for consumers beyond the prices they pay. In these cases, online video providers “create a lot more value than they capture,” Collis says.

The study also revealed the huge value that consumers place on certain categories of online goods. For instance, people valued search engines at an average of $17,530 per year, and email at $8,414. Collis suggests those numbers may appear so high because many people use search engines and email both at work and in leisure time, and use both factors to assess the overall value.

Regarding specific companies and products, the surveys found that consumers who use YouTube or Instagram place a lower value on Facebook. Women place a higher value on Facebook than men do, while households with an income between $100,000 and $150,000 place less value on Facebook than both lower-income and higher-income households.

Mend it, don’t end it

The current study is the latest serious effort to reassess the common use of GDP. Critics have long suggested we rely too heavily on GDP as an indicator of overall well-being, since there is more to life than economic production.

In a separate but related critique, some observers — and many Silicon Valley technologists — have been contending in recent years that free online products were neglected by GDP. Those free goods can also be thought to add to our overall “well-being,” in theory.

Certainly there are good reasons to think a refinement of GDP along the study’s lines could be an improvement. Even as the use of computing technology has grown massively, the information sector has remained between 4 percent and 5 percent of U.S. GDP from the early 1980s until 2016.

For their part, the authors regard the current paper as just one part of a bigger research program concerning GDP. As part of their ongoing work, they are attempting to arrive at a large-scale number summarizing the value of products currently overlooked by standard GDP measures, and produce an alternate version of GDP. That new figure, Collis says, could usefully supplement our measuring tools for national economies.

“GDP is a great measure of production,” Collis says. “We should not replace it.” However, he adds, “In parallel, we should also be measuring economic well-being [in ways that] account for new and free goods.”

3 Questions: Why are student-athletes amateurs?

Debate about the unpaid status of NCAA athletes has surged in the last decade — and did so again last month when the best player in men’s college basketball, Zion Williamson, got injured in a high-profile game. Meanwhile, graduate student unionization drives frequently raise the same question: Aren’t some students also workers creating value for universities? And how did we come to regard student-athletes, say, as amateurs in the first place?

Jennifer Light, the Bern Dibner Professor in the History of Science and Technology and a professor of urban studies and planning, has just published an article in the Harvard Educational Review on the history of this idea that students are not part of the labor force. She places its origins in the 1890-1930 movement to expand public schooling, which promoted schools as alternatives to child labor and put them forth as “protected” places for young people to focus on future-oriented training. MIT News talked to Light about her research. This interview has been edited for length.

Q: How did you become interested in the topic of value-producing students, and the question of whether or not they’re fairly compensated?

A: Previously, I taught at Northwestern University, where I encountered many student-athletes because two of my classes covered some sports history. On more than one occasion, someone raised the question, “Why are we not getting compensated when we bring in so much money [for the university]?” Or I’d hear anecdotally about how a video game company came to scan their bodies for a game that was “cool,” but also not something they got paid for.

That got me thinking about unpaid labor. Of course, student-athletes receive scholarships, but those are quite limited when compared with the compensation they’d get playing outside of school. So I went looking for the origins of the idea that students are not part of the labor force. As it turns out, this idea dates to the emergence of mass schooling in the United States. As it also turns out, there is a long history of schools profiting from student activities — and not just sports.

Q: The “alternative history” of students you describe primarily occurs from about 1890 to 1930, with the movement to make public education available for everyone. What happened in this time period that was so important, in this regard?

A: Public education became popular, along with compulsory-schooling legislation, largely because of the industrial economy. The spread of schools was part of a national effort to train children for future industrial jobs and reduce child labor. And at some level, yes, when kids went to school, they were protected from going into factories or coal mines.

On the other hand, because so many public schools needed to get off the ground at the same time and local governments did not have adequate resources, educators assigned pupils to build and operate their schools: making desks and lockers, building playground equipment and gyms, keeping financial records, running the lunch room, everything from ordering supplies to cooking and serving the meal. Kids repaired school plumbing and heating systems, did health inspections, and tracked down truants.

This was celebrated as the cutting-edge curriculum, the “new education” for the industrial age. John Dewey said when you bring the school close to life, that motivates students’ learning. Because these things were done for educational purposes and no money exchanged hands, they were not considered “work.” Of course, if kids did the same tasks in the “real world,” they would be paid. We still use this language of school versus the real world today. So this mindset originated in public schools and only later carried over into education for older adolescents, which was less common, particularly before World War II.

Q: How and when did the idea of the “protected” student make the leap from public schools to universities?

A: In the American mindset, until about 1930, you were a kid until you were between about 14 and 16, because in most places high school was not compulsory; education was compulsory until the eighth grade. And people were fighting to change this, but there were plenty of late teenagers in the work force.

Mass unemployment during the Great Depression was a catalyst for extending this protective period to older adolescents, through their early 20s. [The thinking was] that adults should be top priority for available jobs. So although increasingly specialized jobs were a contributing factor, the desire not to compete with adults was a major force behind the expansion of training for this age group — in high schools, community colleges, universities, and specialized programs such as those sponsored by the National Youth Administration. As with the curriculum for younger pupils, institutional maintenance was a feature of these programs. 

In recent years this sort of routine economic activity inside schools has declined but not disappeared. Today’s controversies around student-athletes and teaching assistants stem in part from the century-old assumption that students by definition are cultivating their human capital and deferring economic participation until they graduate to the “real world.” What I’m trying to show is, that’s always been a fantasy. Of course when students go to school, they get an education. But they could also be producing value for their institutions.

Gulf Stream series wins Knight Science Journalism Program’s Inaugural Victor K. McElheny Award

The Knight Science Journalism Program at MIT has announced that the inaugural Victor K. McElheny Award for local and regional science journalism will go to a team of reporters from the Charleston Post and Courier, for an investigative series that shed light on a little-known impact of climate change and an overlooked risk of offshore drilling in the eastern U.S.

The series featured a captivating piece by Tony Bartelme that took readers “into” the Gulf Stream, the powerful system of currents that carries warm tropical water up the U.S. East Coast to the Arctic. Weaving the story of a 1969 submarine expedition with the more recent story of an unexpected Gulf Stream slowdown, Bartelme expertly conveyed both the current’s might and its fragility in the face of climate change. In a data-driven companion piece, Bartelme and Emory Parker used more than 1,000 simulations to paint a startling picture of how the Gulf Stream could complicate efforts to contain spills from offshore drilling operations — a salient concern now that some lawmakers are pushing to open the East Coast to drilling. And in a mark of the team’s innovative approach to audience engagement, the series included an adult coloring book: “30 Days in the Gulf Stream,” designed by Bartelme and Chad Dunbar.

“It was really well done and creative — an unexpected story told with great storytelling technique,” remarked a member of the judging panel. “The topic was fresh, and it had real impact.” National environmental groups described the team’s work as “stunning,” and the series helped energize the drilling debate ahead of South Carolina’s 2018 elections.

In addition to the Post and Courier series, judges honored two other outstanding entries as finalists: The Seattle Times series Hostile Waters, a gut-wrenching story of how hunting, pollution, and other human activities have caused the population of Southern Resident Orcas in Puget Sound to dwindle toward extinction; and The Last Grove, a Tampa Bay Times feature that recounts the closing of Hillsborough County’s last commercial orange grove, a victim of Florida’s citrus greening epidemic. The three honorees rose to the top of a competitive field that included more than 100 entries from newspapers, magazines, and radio stations across the U.S.

Named after the Knight Science Journalism Program’s founding director, the Victor K. McElheny Award was established to honor outstanding coverage of science, public health, technology, and environmental issues at the local and regional level. “The local newspaper and radio station are where many people get the news that matters to them the most, and sadly, a lot of good science reporting at these outlets goes unnoticed,” said Deborah Blum, director of the Knight Science Journalism Program. “So it was really encouraging to see the quality, breadth, and depth of science coverage in this year’s entries — and to see that these stories are having real impacts in their communities.”

The winning team from the Post and Courier will be honored at a luncheon ceremony at MIT’s Samberg Center on Wednesday, April 17.

The McElheny Award is made possible by generous support from Victor K. McElheny, Ruth McElheny, and the Rita Allen Foundation. The award’s judges and screeners include Brian Bergstein (freelance journalist), Magnus Bjerg (TV 2, Denmark), Alicia Chang (Associated Press), Jason Dearen (Associated Press), Lisa De Bode (freelance journalist), Gideon Gil (STAT), Elana Gordon (WHYY), and Barbara Moran (WBUR).

2019 McElheny Award honorees

Winner:

Charleston Post and Courier (Tony Bartelme, Chad Dunbar, and J. Emory Parker)

A powerful current just miles from SC is changing. It could devastate the East Coast.

If oil spilled off SC’s coast, a huge current would make it ‘impossible to control’

A massive current off Charleston’s coast is changing

Finalists:

Seattle Times (Lynda V. Mapes, Steve Ringman, Emily Eng, Lauren Frohne, and Ramon Dompor)

Hostile Waters

The orca and the orca catcher: How a generation of killer whales was taken from Puget Sound

To catch an orca

Tampa Bay Times (Lisa Gartner)

Florida scientists are working to solve greening. They were too late for Cee Bee’s.

The Knight Science Journalism Program at MIT, founded more than 30 years ago, seeks to nurture and enhance the ability of journalists from around the world to accurately document and illuminate the often complex intersection of science, technology, and human culture. It does so through an acclaimed fellowship program — which hosts 10 or more journalists every academic year — and also through science-focused seminars, skills-focused master classes, workshops, and publications.

Since it began, the program has hosted more than 300 fellows, who continue to cover science across a range of platforms in the United States, including The New York Times, The Wall Street Journal, Forbes, Time, Scientific American, Science, the Associated Press, and broadcast outlets ranging from ABC News to CNN, as well as in numerous other countries.

Tim Berners-Lee named FT “Boldness in Business” Person of the Year

The week that his invention of the World Wide Web turned 30, MIT professor Sir Tim Berners-Lee has been named the Financial Times’ Person of the Year in their special “Boldness in Business” issue.

Berners-Lee was honored for his new startup inrupt, which emerged out of work at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) developing the open-data platform Solid.

Solid aims to give users ownership of their data and to enable the building of decentralized social applications.

“Right now we really have the worst of all worlds, in which people not only cannot control their data, but also can’t really use it, because it’s spread across a number of different silo-ed websites,” says Berners-Lee. “Our goal is to ‘re-decentralize the web’ and develop a web architecture that gives users more control over the information they provide to applications.”

Solid has produced some 50,000 so-called personal online data stores (PODs) that are being tested by thousands of developers across more than 25 countries. His company is also collaborating with partners such as the U.K.’s National Health Service to explore scaling up Solid, and it intends to launch a user product by the end of the year.

In the FT article, Berners-Lee acknowledges the challenges of breaking through with a new paradigm in a climate where companies have vested interests in maintaining their data ecosystem. But he retains a healthy optimism that recent concerns about data privacy have created more momentum for a project like this.

“It is rocket science. It is tricky. Things can blow up on you,” Berners-Lee told FT. “But we know how to fire rockets into the sky. We should be able to build constructive social networks.”

Besides his responsibilities at CSAIL, Berners-Lee is director of the World Wide Web Consortium, which develops web standards, specifications, and tools, as well as director of the World Wide Web Foundation, which does advocacy related to “a free and open web for everyone.”

He is the 3Com Founders Professor of Engineering in the Department of Electrical Engineering and Computer Science at MIT as well as a recipient of the ACM Turing Award, often described as “the Nobel Prize of computing,” for inventing the web and developing the protocols that spurred its global use.

“Tim’s contributions to computer science have fundamentally transformed the world, and his more recent work with inrupt is poised to do the same,” says CSAIL Director Daniela Rus. “All of us at the lab — and MIT more broadly — are so very proud of him and excited to see how his efforts will continue to impact the way that people use and share data.”

Ethics, Computing, and AI: Perspectives from MIT

The MIT Stephen A. Schwarzman College of Computing will reorient the Institute to bring the power of computing and AI to all fields at MIT; allow the future of computing and AI to be shaped by all MIT disciplines; and advance research and education in ethics and public policy to help ensure that new technologies benefit the greater good.

To support ongoing planning for the new college, Dean Melissa Nobles invited faculty from all five MIT schools to offer perspectives on the societal and ethical dimensions of emerging technologies. This series presents the resulting commentaries — practical, inspiring, concerned, and clear-eyed views from an optimistic community deeply engaged with issues that are among the most consequential of our time. 

The commentaries represent diverse branches of knowledge, but they sound some common themes, including the vision of an MIT culture in which all of us are equipped and encouraged to discern the impact and ethical implications of our endeavors.

FOREWORD
Ethics, Computing, and AI  
Melissa Nobles, Kenan Sahin Dean, and Professor of Political Science
School of Humanities, Arts, and Social Sciences

“These commentaries, representing faculty from all five MIT schools, implore us to be collaborative, foresighted, and courageous as we shape a new college — and to proceed with judicious humility. Rightly so. We are embarking on an endeavor that will influence nearly every aspect of the human future.” Read more >>

INTRODUCTION
The Tools of Moral Philosophy
Caspar Hare, Professor of Philosophy
Kieran Setiya, Professor of Philosophy
School of Humanities, Arts, and Social Sciences

“We face ethical questions every day. Philosophy does not provide easy answers for these questions, nor even fail-safe techniques for resolving them. What it does provide is a disciplined way to think about ethical questions, to identify hidden moral assumptions, and to establish principles by which our actions may be guided and judged. Framing a discussion of the risks of advanced technology entirely in terms of ethics suggests that the problems raised are ones that can and should be solved by individual action. In fact, many of the challenges presented by computer science will prove difficult to address without systemic change.”

Action: Moral philosophers can serve both as teachers in the new College and as advisers/consultants on project teams. Read more >>

WELCOMING REMARKS
A New Kind of Education
Susan Silbey, Chair of the MIT Faculty
Celebration for the MIT Schwarzman College of Computing
28 February 2019

“The college of computing will be dedicated to educating a different kind of technologist. We hope to integrate computing with just about every other subject at MIT so that students leave here with the knowledge and resources to be wiser, more ethically and technologically competent citizens and professionals.” Read more >>

Part I: A Human Endeavor
Computing is embedded in cultural, economic, and political realities.

Computing is Deeply Human
Stefan Helmreich, Elting E. Morison Professor of Anthropology
Heather Paxson, William R. Kenan, Jr. Professor of Anthropology
School of Humanities, Arts, and Social Sciences

“Computing is a human practice that entails judgment and is embedded in politics. Computing is not an external force that has an impact on society; instead, society — institutional structures that organize systems of social norms — is built right into making, programming, and using computers.”

Action: The computational is political; MIT can make that recognition one of the pillars of computing and AI research. Read more >>

When Computer Programs Become Unpredictable
John Guttag, Dugald C. Jackson Professor of Computer Science and Electrical Engineering
School of Engineering

“We should look forward to the many good things machine-learning will bring to society. But we should also insist that technologists study the risks and clearly explain them. And society as whole should take responsibility for understanding the risks and for making human-centric choices about how best to use this ever-evolving technology.”

Action: Develop platforms that enable a wide spectrum of society to engage with the societal and ethical issues of new technology. Read more >>

Safeguarding Humanity in the Age of AI
Bernhardt Trout, Raymond F. Baddour Professor of Chemical Engineering
School of Engineering

“There seem to be two possibilities for how AI will turn out. In the first, AI will do what it is on track to do: slowly take over every human discipline. The second possibility is that we take the existential threat of AI with the utmost seriousness and completely change our approach. This means redirecting our thinking from a blind belief in efficiency to a considered understanding of what is most important about human life.”

Action: Develop a curriculum that encourages us to reflect deeply on fundamental questions: What is justice? How ought I to live? Read more >>

Part II: Community Insights
Shaping ethical technology is a collective responsibility.

The Common Ground of Stories
Mary Fuller, Professor of Literature and Head of the MIT Literature Section
School of Humanities, Arts, and Social Sciences

“Stories are things in themselves, and they are also things to think with. Stories allow us to model interpretive, affective, ethical choices; they also become common ground. Reading about Milton’s angelic intelligences or William Gibson’s “bright lattices of logic” won’t tell us what we should do with the future, but reading such stories at MIT may offer a conceptual meeting place to think together across the diversity of what and how we know.”

Action: Create residencies for global storytellers in the MIT Schwarzman College of Computing. Read more >>

Who’s Calling the Shots with AI?
Leigh Hafrey, Senior Lecturer, Leadership and Ethics
MIT Sloan School of Management

“‘Efficiency’ is a perennial business value and a constant factor in corporate design, strategy, and execution. But in a world where the exercise of social control by larger entities is real, developments in artificial intelligence have yet to yield the ethics by which we might manage their effects. The integrity of our vision for the future depends on our learning from the past and celebrating the fact that people, not artifacts and institutions, set our rules of engagement.”

Action: Adopt a full-on stakeholder view of business in society and the individual in business. Read more >>

In Praise of Wetware
Caroline A. Jones, Professor of Art History
School of Architecture and Planning

“As we enshrine computation as the core of smartness, we would be well advised to think of the complexity of our ‘wet’ cognition, which entails a much more distributed notion of intelligence that goes well beyond the sacred cranium and may not even be bounded by our own skin.”

Action: Before claiming that it is “intelligence” we’ve produced in machines or modeled in computation, we should better understand the adaptive, responsive human wetware — and its dependence on a larger living ecosystem. Read more >>

Blind Spots
David Kaiser, Germeshausen Professor of the History of Science, and Professor of Physics
School of Humanities, Arts, and Social Sciences, and Department of Physics

“MIT has a powerful opportunity to lead in the development of new technologies while also leading careful, deliberate, broad-ranging, and ongoing community discussions about the ‘whys’ and ‘what ifs,’ not just the ‘hows.’ No group of researchers, flushed with the excitement of learning and building something new, can overcome the limitations of blind spots and momentum alone.”

Action: Create ongoing forums for brainstorming and debate; we will benefit from engaging as many stakeholders as possible. Read more >>

Assessing the Impact of AI on Society
Lisa Parks, Professor of Comparative Media Studies
School of Humanities, Arts, and Social Sciences

“Three fundamental societal challenges have emerged from the use of AI, particularly for data collection and machine learning. The first challenge centers on this question: Who has the power to know about how AI tools work, and who does not? A second challenge involves learning how AI tools intersect with international relations and the dynamics of globalization. Beyond questions of knowledge, power, and globalization, it is important to consider the relationship between AI and social justice.”

Action: Conduct a political, economic, and materialist analysis of the relationship of AI technology to global trade, governance, natural environments, and culture. Read more >>

Clues and Caution for AI from the History of Biomedicine
Robin Wolfe Scheffler, Leo Marx Career Development Professor in the History and Culture of Science and Technology
School of Humanities, Arts, and Social Sciences

“The use of AI in the biomedical fields today deepens longstanding questions raised by the past intractability of biology and medicine to computation, and by the flawed assumptions that were adopted in attempting to make them so. The history of these efforts underlines two major points: ‘Quantification is a process of judgment and evaluation, not simple measurement’ and ‘Prediction is not destiny.'”

Action: First, understand the nature of the problems we want to solve — which include issues not solvable by technical innovation alone. Let that knowledge guide new AI and technology projects. Read more >>

The Environment for Ethical Action
T.L. Taylor, Professor of Comparative Media Studies
School of Humanities, Arts, and Social Sciences

“We can cultivate our students as ethical thinkers but if they aren’t working in (or studying in) structures that support advocacy, interventions, and pushing back on proposed processes, they will be stymied. Ethical considerations must include a sociological model that focuses on processes, policies, and structures and not simply individual actors.”

Action: Place a commitment to social justice at the heart of the MIT Schwarzman College of Computing. Read more >>

Biological Intelligence and AI
Matthew A. Wilson, Sherman Fairchild Professor of Neuroscience
School of Science and the Picower Institute

“An understanding of biological intelligence is relevant to the development of AI, and the effort to develop artificial general intelligence (AGI) magnifies its significance. AGIs will be expected to conform to standards of behavior … Should we hold AIs to the same standards as the average human? Or will we expect AIs to perform at the level of an ideal human?”

Action: Conduct research on how innate morality arises in human intelligence, as an important step toward incorporating such a capacity into artificial intelligences. Read more >>

Machine Anxiety
Bernardo Zacka, Assistant Professor of Political Science
School of Humanities, Arts, and Social Sciences

“To someone who studies bureaucracy, the anxieties surrounding AI have an eerily familiar ring. So too does the excitement. For much of the 20th century, bureaucracies were thought to be intelligent machines. As we examine the ethical and political implications of AI, there are at least two insights to draw from bureaucracy’s history: That it is worth studying our anxieties whether or not they are realistic; and that in doing so we should not write off human agency.”

Action: When societies undergo deep transformations, envisioning a future that is both hopeful and inclusive is a task that requires moral imagination, empathy, and solidarity. We can study the success of societies that have faced such challenges well. Read more >>

Part III: A Structure for Collaboration
Thinking together is powerful.

Bilinguals and Blending
Hal Abelson, Class of 1922 Professor of Electrical Engineering and Computer Science
School of Engineering

“When we study society today, we can no longer separate humanities — the study of what’s human — from computing. So, while there’s discussion under way about building bridges between computing and the humanities, arts, and social sciences, what the College of Computing needs is blending, not bridging. MIT’s guideline should be President Reif’s goal to ‘educate the bilinguals of the future’ — experts in many fields who are also skilled in modern computing.”

Action: Develop approaches for joint research and joint teaching. Read more >>

A Dream of Computing
Fox Harrell, Professor of Digital Media and Artificial Intelligence
School of Humanities, Arts, and Social Sciences + Computer Science and Artificial Intelligence Lab

“There are numerous perspectives on what computing is: some people focus on theoretical underpinnings, others on implementation, others still on social or environmental impacts. These perspectives are unified by shared characteristics, including some less commonly noted: computing can involve great beauty and creativity.”

Action: “We must reimagine our shared dreams for computing technologies as ones where their potential social and cultural impacts are considered intrinsic to the engineering practices of inventing them.” Read more >>

A Network of Practitioners
Nick Montfort, Professor of Media Studies
School of Humanities, Arts, and Social Sciences

“Computing is not a single discipline or even a set of disciplines; it is a practice. The new College presents an opportunity for many practitioners of computing at MIT.”

Action: Build a robust network with many relevant types of connections, not all of them through a single core. Read more >>

Two Commentaries
Susan Silbey, Chair of the MIT Faculty
Goldberg Professor of Humanities, Professor of Sociology and Anthropology, and Professor of Behavioral and Policy Sciences
School of Humanities, Arts, and Social Sciences and MIT Sloan School of Management

How Not To Teach Ethics  — “Rather than thinking about ethics as a series of anecdotal instances of problematic choice-making, we might think about ethics as participation in a moral culture, and then ask how that culture supports or challenges ethical behavior.”

Forming the College  — “The Stephen A. Schwarzman College is envisioned to be the nexus connecting those who advance computer science, those who use computational tools in specific subject fields, and those who analyze and write about digital worlds.” Read more >>

Ethical AI by Design
Abby Everett Jaques, Postdoctoral Associate, Philosophy
School of Humanities, Arts, and Social Sciences

“We are teaching an ethical protocol, a step-by-step process that students can use for their own projects. In this age of self-driving cars and machine learning, the questions feel new, but in many ways they’re not. Philosophy offers powerful tools to help us answer them.” Read more >>

Series prepared by MIT SHASS Communications
Office of the Dean, MIT School of Humanities, Arts, and Social Sciences
Series Editors: Emily Hiestand, Kathryn O’Neill

MIT celebrates 50th anniversary of historic moon landing

On Sept. 12, 1962, in a speech given in Houston to pump up support for NASA’s Apollo program, President John F. Kennedy shook a stadium crowd with the now-famous quote: “We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard.”

As he delivered these lines, engineers in MIT’s Instrumentation Laboratory were already taking up the president’s challenge. One year earlier, NASA had awarded MIT the first major contract of the Apollo program, charging the Instrumentation Lab with developing the spacecraft’s guidance, navigation, and control systems that would shepherd astronauts Michael Collins, Buzz Aldrin, and Neil Armstrong to the moon and back.

On July 20, 1969, the hard work of thousands paid off, as Apollo 11 touched down on the lunar surface, safely delivering Armstrong and Aldrin ScD ’63 as the first people to land on the moon.

On Wednesday, MIT’s Department of Aeronautics and Astronautics (AeroAstro) celebrated the 50th anniversary of this historic event with the daylong symposium “Apollo 50+50,” featuring former astronauts, engineers, and NASA administrators who examined the legacy of the Apollo program, and MIT faculty, students, industry leaders, and alumni who envisioned what human space exploration might look like in the next 50 years.

In welcoming a large audience to Kresge Auditorium, some of whom sported NASA regalia for the occasion, Daniel Hastings, head of AeroAstro, said of today’s prospects for space exploration: “It’s the most exciting time since Armstrong and Aldrin landed on the moon.”

The event kicked off three days of programming for MIT Space Week, which also included the Media Lab’s “Beyond the Cradle: Envisioning a New Space Age” on March 14, and the student-led “New Space Age Conference” on March 15.

“We could press on”

As a “baby boomer living through Apollo,” retired astronaut Charles Bolden, NASA’s 12th administrator, said the Apollo program illustrated “how masterful we were at overcoming adversity.” In a keynote address that opened the day’s events, Bolden reminded the audience that, at the time the ambitious program got underway in the 1960s, the country was in the violent thick of the civil rights movement.

“We were killing each other in the streets,” Bolden said. “And yet we had an agency like NASA, and a small group of people, who were able to bear through everything and land on the moon. … We could recognize there were greater things we could do as a people, and we could press on.”

For MIT’s part, the push began with a telegram on Aug. 9, 1961, to Charles Stark Draper, director of the Instrumentation Laboratory, notifying him that NASA had selected the MIT lab “to develop the guidance navigation system of the Project Apollo spacecraft.” Draper, who was known widely as “Doc,” famously assured NASA of MIT’s work by volunteering himself as a crew member on the mission, writing to the agency that “if I am willing to hang my life on our equipment, the whole project will surely have the strongest possible motivation.”

This of course proved unnecessary, and Draper went on to lead the development of the guidance system with “unbounded optimism,” as his former student and colleague Lawrence Young, the MIT Apollo Program Professor, recalled in his remarks.

“We owe the lighting of our fuse to Doc Draper,” Young said.

At the time that MIT took on the Apollo project, the Instrumentation Laboratory, later renamed Draper Laboratory, took up a significant footprint, with 2,000 people and 15 buildings on campus, dedicated largely to the lunar effort.

“The Instrumentation Lab dwarfed the [AeroAstro] department,” said Hastings, joking, “it was more like the department was a small pimple on the Instrumentation Lab.”

Apollo remembered

In a highlight of the day’s events, NASA astronauts Walter Cunningham (Apollo 7) and Charles Duke SM ’64 (Apollo 16), and MIT Instrumentation Laboratory engineers Donald Eyles and William Widnall ’59, SM ’62 — all from the Apollo era — took the stage to reminisce about some of the technical challenges and emotional moments that defined the program.

One of the recurring themes of their conversation was the observation that things simply got done faster back then. For instance, Duke remarked that it took just 8.5 years from when Kennedy first called for the mission to when Armstrong’s boots hit the lunar surface.

“I would argue the proposal for such a mission would take longer [today],” Duke said to an appreciative rumble from the audience.

The Apollo Guidance Computer, developed at MIT, weighed 70 pounds, consumed 55 watts of power — half the wattage of a regular lightbulb — and took up less than 1 cubic foot inside the spacecraft. The system was one of the first digital flight computers, and one of the first computers to use integrated circuits.  

Eyles and Widnall recalled in detail the technical efforts that went into developing the computer’s hardware and software. “If you’re picturing [the computer code] on a monitor, you’d be wrong,” Eyles told the audience. “We were writing the program on IBM punch cards. That clunking mechanical sound of the key-punch machine was the soundtrack to creating the software.”

Written out, that code famously amounted to a stack of paper as tall as lead software engineer Margaret Hamilton — who was not able to participate in Wednesday’s panel but attended the symposium dinner that evening.

In the end, the Apollo Guidance Computer succeeded in steering 15 space flights, including nine to the moon, and six lunar landings. That’s not to say that the system didn’t experience some drama along the way, and Duke, who was the capsule communicator, or CAPCOM, for Apollo 11, remembers having to radio up to the spacecraft during the now-famous rocky landing.

“When I heard the first alarm go off during the braking phase, I thought we were dead in the water,” Duke said of the first in a series of alerts the Apollo astronauts reported, indicating that the computer was overloaded during the most computationally taxing phase of the mission. The spacecraft was several miles off course and needed to fly over a “boulder field” and land within 60 seconds, or risk running out of fuel.

Flight controllers in Houston’s Mission Control Center determined that if nothing else went wrong, the astronauts, despite the alarms, could proceed with landing.

“Tension was high,” Duke said of the moment. “You didn’t want to touch down on a boulder and blow a nozzle, and spoil your whole day.”

When the crew finally touched down on the Sea of Tranquility, with Armstrong’s cool report that “the Eagle has landed,” Duke was too wound-up to properly verbalize the callback “Tranquility.” “I was so excited … it came out as ‘Twang,’ or something like that,” he recalled. “The tension — it was like popping a balloon.”

Since the Apollo era, NASA has launched astronauts, many of them MIT graduates, on numerous missions. On Wednesday, 13 of those graduates came onstage to be recognized along with the Apollo crew.

In introducing them to the audience, Jeffrey Hoffman, a former astronaut and now AeroAstro professor of the practice, noted MIT’s significant representation in the astronaut community. For instance, the five missions to repair the Hubble Space Telescope comprised 24 spacewalks, 13 of which were performed by MIT graduates.

“That’s pretty cool,” Hoffman said.

On the horizon

The Apollo moon rocks that were brought back to Earth have “evolved our understanding of how the moon formed,” said Maria Zuber, MIT’s vice president for research and the E.A. Griswold Professor of Geophysics in the Department of Earth, Atmospheric and Planetary Sciences. These rocks “vanquished” the idea that the moon originally formed as a cold assemblage of rocks and “foo foo dust,” she said.

Instead, after carefully analyzing samples from Apollo 11 and other missions, scientists at MIT and elsewhere have found that the moon was a dynamic body, with a surface that at one time was entirely molten, and a metallic core, or “dynamo,” powering an early, lunar magnetic field. Even more provocative was the finding that the moon was not in fact “bone-dry,” but actually harbored water — an idea that Zuber said was virtually unpublishable until an MIT graduate reported evidence of water in Apollo samples, after which the floodgates opened in support of the idea.

To consider the next 50 years of space exploration, the MIT symposium featured a panel of faculty members — Paulo Lozano, Danielle Wood, Richard Binzel, and Sara Seager — who highlighted, respectively, the development of tiny thrusters to power miniature spacecraft; an effort to enable wider access to microgravity missions; an MIT student-designed mission (REXIS) that is currently analyzing the near-Earth asteroid Bennu; and TESS and ASTERIA, satellite missions that are currently in orbit, looking for planets and, possibly, life outside our solar system.

Industry leaders also weighed in on the growing commercialization of space exploration, in a panel featuring MIT alums who currently head major aerospace companies.

Keoki Jackson, chief technology officer of Lockheed Martin, noted the pervasiveness of space-based technologies, such as GPS-dependent apps for everything from weather and news, to Uber.

“[Commercial enterprises] have made space a taken-for-granted part of life,” said Jackson, noting later in the panel that in 2015, 1 billion GPS devices had been sold around the world. “This shows you what can happen exponentially when you come up with something truly enabling.”

“The challenge we face is talent, and in particular, diversity,” said John Langford, CEO and founder of Aurora Flight Sciences, who noted the panel’s all-male participants as an example. “It’s an industry-wide challenge. We’re working to reform ourselves, as we move from the brigade-type technologies that we grew up with, to incorporating technologies such as computer technology and artificial intelligence.”

Future missions

In a glimpse of what the future of space exploration might hold, MIT students presented lightning talks on a range of projects, including a custom-designed drill to excavate ice on Mars, a system that makes oxygen on Mars to fuel return missions to Earth, and a plan to send CubeSats around the world to monitor water vapor as a measure of climate change.

Audience members voted online for the best pitch, which ultimately went to Raichelle Aniceto and her presentation of a CubeSat-enabled laser communications system designed to transmit large amounts of data from the moon to Earth in just five minutes.

In the last keynote address of the symposium, Thomas Zurbuchen, associate administrator for NASA’s Science Mission Directorate, told the audience that there is still a lot of research to be done on the moon, which he said is changing, as evidenced by new craters that have formed in the last 50 years.

“The moon of the Apollo era is not the same moon of today,” said Zurbuchen, who noted that just this week, NASA announced it will open previously unlocked samples of soil collected by the Apollo missions.

In closing the symposium, Dava Newman, the Apollo Program Professor of Astronautics and former NASA deputy administrator, envisioned a future dedicated to sending humans back to the moon, and ultimately to Mars.

“I’m a rocket scientist. I got here because of Apollo, and Eleanor Roosevelt said it best: Believe in the beauty of your dreams,” Newman said. “The challenge is, within 50 years, to be boots on Mars. I think we have the brains and the doers and inspiration to really make that happen.”

Microgravity research after the International Space Station

For nearly 20 years, the International Space Station (ISS) has served as a singular laboratory for thousands of scientists, students, and startups around the world, who have accessed the station’s microgravity environment to test how being in space impacts everything from cancer cells and human tissues to zucchini and barley seeds — not to mention a host of living organisms including flatworms, ants, geckos, and bobtail squids.

Indeed, the ISS “has operated as a bastion of international cooperation and a unique testbed for microgravity research,” write MIT engineers in a paper they presented on March 8 at the IEEE Aerospace Conference in Montana. But the ISS will eventually be retired in its current form. NASA is preparing to transition the focus of its human space flight activities to the Moon, and the international partners that manage the ISS are discussing how to transition out of the current operational model.

As NASA explores options for commercial entities to operate research platforms in orbit around Earth, and while other public and private entities consider alternative designs for microgravity facilities, the MIT team says it’s important to keep affordable access to such facilities at the forefront of these discussions. In their paper, the researchers argue that scientists from any country should be able to participate in microgravity research.

Toward that end, the team has developed a tool for evaluating the accessibility of various “governance models,” such as facilities controlled mostly by governments, mostly by private entities, or by a mixture of both.

MIT News checked in with the researchers about the future of microgravity research and how openness can drive innovation and collaboration in space. Christine Joseph is a graduate student in MIT’s Department of Aeronautics and Astronautics and the Technology and Policy Program. Danielle Wood is the Benesse Corporation Career Development Assistant Professor of Research in Education within MIT’s Program in Media Arts and Sciences and jointly appointed in the Department of Aeronautics and Astronautics. She is also founder of the Space Enabled Research Group within the MIT Media Lab, whose mission is to advance justice in Earth’s complex systems using designs enabled by space.

Q: Why is affordable access important, particularly for space-based microgravity research?

Wood: Participation in space-based microgravity research should be an opportunity open to researchers from every nation because space is a global commons that does not belong to a single nation. As stated in the Outer Space Treaty, ratified by over 100 countries, “the exploration and use of outer space … shall be carried out for the benefit and in the interests of all countries … and shall be the province of all [hu]mankind.”

Studies in the microgravity environment bring new knowledge about the human body, plants, animals, materials, physics, manufacturing, and medicines. This knowledge can contribute to sustainable development when it is translated into Earth-based applications, such as when knowledge of astronaut exercise routines informs recovery procedures for patients facing long periods of bedrest, or when experiments about the physics of combustion yield results that can improve fire safety on Earth.

When a larger variety of researchers from around the world participate in microgravity research, the scientific community benefits from the broader range of research outcomes. Participation in microgravity research also helps countries that do not yet have experience in space build local capability to design and operate space-based experiments.

Q: How does your new tool evaluate accessibility to microgravity research facilities?

Joseph: We propose that accessibility can be measured using the metrics of economic and administrative openness. Economic openness is based on the financial costs paid by researchers to perform all the activities involved with completing a microgravity research project. This includes the costs associated with designing an experiment, engineering it to be safe and functional, launching it to space, accessing a facility that provides environmental control, data and power, operating the experiment, and possibly returning it to Earth.

Administrative openness refers to the type of gatekeeping that directly and indirectly determines who can participate. For example, today administrative procedures influence access depending on the nationality or type of organization the user comes from and the type of microgravity activity they are seeking. We map future microgravity research facilities and their governance policies along these dimensions of economic and administrative openness. Using these two metrics, we can rate the overall accessibility of a future marketplace for microgravity research.
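As a thought experiment, the two metrics can be pictured as axes on which any proposed facility is placed. A minimal sketch of that idea in Python, assuming a 0-to-1 scale for each axis and an equal-weight average — the facility names, scores, and scoring formula here are hypothetical illustrations, not taken from the researchers’ paper:

```python
from dataclasses import dataclass

@dataclass
class Facility:
    """A hypothetical microgravity research platform and its openness scores."""
    name: str
    economic_openness: float        # 0 = prohibitively costly, 1 = low-cost access
    administrative_openness: float  # 0 = heavy gatekeeping, 1 = open to any researcher

def accessibility(f: Facility) -> float:
    # Combine the two metrics into one overall rating (equal weighting assumed).
    return 0.5 * (f.economic_openness + f.administrative_openness)

# Two illustrative governance models for a post-ISS platform.
subsidized = Facility("government-subsidized lab", economic_openness=0.8,
                      administrative_openness=0.5)
commercial = Facility("fully commercial module", economic_openness=0.3,
                      administrative_openness=0.9)

for f in (subsidized, commercial):
    print(f"{f.name}: accessibility {accessibility(f):.2f}")
```

In a sketch like this, a heavily subsidized facility can score higher on the economic axis while a commercial one scores higher on the administrative axis; the point of the two-metric framing is that overall accessibility depends on both.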

Wood: Our goal is to encourage a dialogue about the value of providing access to this unique research environment. Many stakeholders — governments, companies, international organizations — may influence the rules that determine who sends microgravity research to space after the International Space Station is retired. Thus far, the world has not experienced a microgravity research marketplace that is fully driven by commercial forces with prices set by a free market, because governments have subsidized the cost of research access as a public service. This work highlights the need to evaluate future policy and commercial proposals based on the needs of those that have the least access and experience with microgravity research today.

Q: What type of facility or structure have you found, through your tool, can provide the most affordable access to microgravity research, and what will it take to launch such a model?

Joseph: Although not ideal, our current structure has evolved to become surprisingly accessible. Facilitators like the United Nations Office for Outer Space Affairs help to broker access for emerging space nations by working with some of the “gatekeeper” space agencies that built the ISS. Commercial companies have also started to build and operate their own modules attached to the ISS that almost any user can buy access to. The ISS has become this interesting conglomeration of public, private, commercial, and international entities. So far, none of the other proposals for space stations in low Earth orbit (up to about 2,000 kilometers from the Earth’s surface) are mature enough to determine whether they will have a similar level of accessibility as the current environment.

However, we can always do better. Building the ISS was the single largest and most expensive construction project in human history and it involved effort from many countries. There are a lot of lessons to be learned from the development of the ISS in terms of technical and policy models. We also need to take into account the expectations of the commercial companies that will participate in the emerging commercial space economy in low Earth orbit.

The “spaces in space” that we operate in are evolving dramatically. It is not too early to examine how policies and investment decisions will shape the nature of accessibility for microgravity research beyond the International Space Station. Thinking about accessibility now is important to help ensure that microgravity research remains the province of all humankind.
