Nine universities team up to create global infrastructure for digital academic credentials

While digital technology has started to transform education by enabling new learning pathways that are customized to each individual’s needs, the way that educational institutions issue and manage academic credentials has not changed much. Nine leading universities announced that they have formed the Digital Credentials collaboration in order to create a trusted, distributed, and shared infrastructure standard for issuing, storing, displaying, and verifying academic credentials.

“Currently, those who successfully complete a degree from an institution must go back to that institution — sometimes by mail or even in person — each time there is a need to verify the academic credentials earned,” says Sanjay Sarma, MIT vice president for open learning. “This can be a complicated problem, especially if the learner no longer has access to the university. Such is the case with many refugees, immigrants, and displaced populations.” 

The universities working on this effort include Delft University of Technology, the Netherlands; Harvard University Division of Continuing Education; the Hasso Plattner Institute, University of Potsdam, Germany; Massachusetts Institute of Technology; Tecnologico de Monterrey, Mexico; Technical University of Munich, Germany; University of California, Berkeley; University of California, Irvine; and the University of Toronto, Canada. 

“As teaching and learning offered by our universities has come to encompass digital platforms, and as each of our learners have gained the power to shape their own educational trajectory over a lifetime, the question of trusted verification and authentication of learning and credentials poses itself with broad urgency,” says Diana Wu, dean of university extension and new academic ventures at UC Berkeley.

Using technology that relies on strong cryptography to prevent tampering and fraud, and shared ledgers to create a global infrastructure for anchoring academic achievements, the researchers plan to build upon earlier research and pioneering efforts by their institutions — including MIT’s pilot program for issuing all of its graduates a digital version of their diploma that is verified against a blockchain. 
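
To make that verification step concrete, here is a minimal sketch of how a ledger-anchored credential check can work, assuming the credential is a JSON record whose cryptographic hash the issuing institution anchors on a shared ledger. The format and function names are illustrative, not the Digital Credentials project's actual design.

```python
# Minimal sketch of ledger-anchored credential verification (illustrative
# only; not the Digital Credentials project's actual format or API).
import hashlib
import json

def credential_hash(credential: dict) -> str:
    # Canonicalize so issuer and verifier hash exactly the same bytes.
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Stand-in for the shared ledger; in practice this would be a blockchain.
anchored_hashes = set()

def issue(credential: dict) -> None:
    # The institution anchors the credential's hash at graduation time.
    anchored_hashes.add(credential_hash(credential))

def verify(credential: dict) -> bool:
    # Any tampering changes the hash, so the ledger lookup fails.
    return credential_hash(credential) in anchored_hashes

diploma = {"learner": "A. Student", "degree": "SB", "issuer": "Example University"}
issue(diploma)
assert verify(diploma)                           # authentic credential passes
assert not verify({**diploma, "degree": "PhD"})  # forged field is rejected
```

Because only the hash is anchored, the learner keeps the credential itself and can present it to anyone without contacting the issuing institution again.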

One of the driving forces behind this shared effort is the universities’ interest in harnessing these new technologies in ways that prioritize the needs of learners. Digital credentials allow learners to maintain a compelling and verifiable digital record of their lifelong learning achievements that may include badges, internships, bootcamps, certificates, MicroMasters (graduate-level courses), and stackable combinations thereof, as well as traditional degrees — all of which they can easily share with employers or other institutions. Institutions can record and manage the achievements of their learners in a way that is easy, safe, and inexpensive, and minimizes the risk of identity fraud.

“We are well-positioned in academia to use cutting-edge technology to empower learners to advance their careers and education with credentials in the palms of their hands,” says Hans Pongratz, senior vice president for IT systems and services at Technical University of Munich.

The team has now set their sights on the evolution and governance of a shared standard. “Digital credentials are like tokens of social and human capital and hold tremendous value for the individual. The crucial opportunity we have today is to bring together institutions that share a commitment to the benefit of learners, and who can act as stewards of this infrastructure,” says Philipp Schmidt, director of learning innovation at the MIT Media Lab. 

“Our shared vision is one where academic achievements, and the corresponding credentials that verify them, can open up new pathways for individuals to become who they want to be in the future,” says José Escamilla, director of TecLabs Learning Reimagined at Tecnologico de Monterrey.

To learn more about this project, visit digitalcredentials.mit.edu.

J-PAL North America’s newest initiative explores the work of the future

“The future of work will be determined by who wields power and for what purposes. We are in a moment of great transition — we have an opportunity to imagine what a new social contract can be,” said Sarita Gupta, executive director of Jobs with Justice and co-director of Caring Across Generations, as she kicked off last Friday’s launch of J-PAL North America’s Work of the Future Initiative.

Gupta opened the conference with a powerful call to action for participants to shift the narrative around “the future of work.” The Work of the Future Initiative, the newest effort from J-PAL North America, a research center in MIT’s Department of Economics, seeks to identify effective, evidence-based strategies that increase opportunity, reduce disparities, and help all workers navigate the work of the future.

Millions of workers throughout the industrialized and developing worlds could be affected by automation, rising inequality, stagnating educational attainment, and other labor market trends in the coming decades. Many workers lack access to jobs that pay living wages, have jobs with insufficient benefits or protections, or lack the necessary skills or education to progress within their industries in the face of technological change.

By spurring research on effective ways to help workers thrive in today’s changing labor market, the Work of the Future Initiative aims to center worker voices and create a more equitable future of work. The conference addressed a number of big questions, including: How can the future of work be made more equitable, efficient, and just?

“J-PAL North America’s Work of the Future Initiative was launched to catalyze rigorous research on these urgent questions,” explained David Autor, the Ford Professor of Economics at MIT and co-chair of the new initiative.

Autor also serves as vice-chair of the Institute’s complementary Work of the Future Task Force, a recently launched group of MIT faculty and researchers exploring how emerging technologies are changing the nature of human work and what types of education and skills will enable humans to thrive in the digital economy.

The Initiative’s academic leadership, consisting of co-chairs Autor and Matthew Notowidigdo of Northwestern University, along with J-PAL Scientific Director Lawrence Katz of Harvard University, recognized that across the country, policymakers, industry leaders, and social service providers are actively seeking solutions to labor market challenges.

Many well-intentioned, potentially effective ideas remain untested, however, leaving policymakers without the necessary evidence to assess what will be helpful, neutral, or harmful. Too often, academic researchers, government agencies, and nonprofit and industry leaders are working on these critical problems in isolation, and don’t have the time or resources to tap into each other’s expertise.

J-PAL’s newest initiative seeks to fill this gap by generating new research to help answer these important questions. It will catalyze this kind of rigorous, actionable evidence through an innovation competition model and a researcher-facing request for proposals (RFP). 

The innovation competition is currently accepting promising research proposals from practitioners across the country, and will work with selected partners to develop a feasible, rigorous evaluation of a program or policy focused on the future of work.

Selected applicants will receive technical support from J-PAL staff, flexible funding to get an evaluation off the ground, and access to J-PAL’s network of leading academic researchers to help them design and implement randomized evaluations of their programs.

Evelyn Diaz, president of Heartland Alliance and a panelist at the kick-off event, explained why this kind of rigorous evaluation is critical to an organization’s success. “There is a fear of failure about evaluation, and we need to change the narrative,” Diaz said. “The focus should instead be on how we are learning.”

Those seeking to learn more about the competition are encouraged to sign up for J-PAL’s informational webinar on June 26. Through the competition, along with a bi-annual, researcher-facing RFP, the initiative aims to generate actionable research on questions related to the future of work.

Meanwhile, with conferences like last Friday’s kick-off event, the initiative will also serve as a convener, bringing together leading voices in the future-of-work space. At the kick-off, participants from academic institutions, nonprofits, philanthropies, and the private sector gathered to share insights, learn from each other’s experiences, and brainstorm solutions to complex research questions.

Event highlights included a number of engaging, interdisciplinary panels on challenges and opportunities related to the work of the future.

Gupta and Katz, for example, participated in a lively discussion with Abigail Wozniak from the Federal Reserve Bank of Minneapolis on how to shift narratives around the future of work.

Later in the day, Notowidigdo presented key findings from his recent research agenda on the Work of the Future, co-authored by Autor and Northwestern University graduate student Anran Li, and an interdisciplinary panel of industry and nonprofit leaders and academic researchers provided thoughtful commentary on the research agenda.

Jed Kolko, chief economist at Indeed, echoed the review paper’s call for more rigorous research on these topics. “There is a lot of uncertainty about the effect of automation technology on employment. Setting up experiments that will be able to measure those effects is critical.”

David Autor also presented innovative research on how work has — and hasn’t — changed over time, along with its implications for worker well-being, and an interdisciplinary panel of researchers and practitioners discussed how they formed mutually beneficial research-practitioner partnerships.

To wrap up the day, J-PAL North America Executive Director Mary Ann Bates moderated a wide-ranging panel on the changing nature of work in the United States that included Katz, J-PAL affiliate Damon Jones, and Julie Gehrki, vice president of philanthropy at the Walmart Foundation.

Bates’ opening remarks on the motivating principle behind the initiative set the tone for the rest of the day’s discussions. “The reason why we care about these topics is because of people,” she said.

Can science writing be automated?

The work of a science writer, this one included, involves reading journal papers filled with specialized technical terminology and figuring out how to explain their contents in language that readers without a scientific background can understand.

Now, a team of scientists at MIT and elsewhere has developed a neural network, a form of artificial intelligence (AI), that can do much the same thing, at least to a limited extent: It can read scientific papers and render a plain-English summary in a sentence or two.

Even in this limited form, such a neural network could be useful for helping editors, writers, and scientists scan a large number of papers to get a preliminary sense of what they’re about. But the approach the team developed could also find applications in a variety of other areas besides language processing, including machine translation and speech recognition.

The work is described in the journal Transactions of the Association for Computational Linguistics, in a paper by Rumen Dangovski and Li Jing, both MIT graduate students; Marin Soljačić, a professor of physics at MIT; Preslav Nakov, a senior scientist at the Qatar Computing Research Institute, HBKU; and Mićo Tatalović, a former Knight Science Journalism fellow at MIT and a former editor at New Scientist magazine.

From AI for physics to natural language

The work came about as a result of an unrelated project, which involved developing new artificial intelligence approaches based on neural networks, aimed at tackling certain thorny problems in physics. However, the researchers soon realized that the same approach could be used to address other difficult computational problems, including natural language processing, in ways that might outperform existing neural network systems.

“We have been doing various kinds of work in AI for a few years now,” Soljačić says. “We use AI to help with our research, basically to do physics better. And as we got to be more familiar with AI, we would notice that every once in a while there is an opportunity to add to the field of AI because of something that we know from physics — a certain mathematical construct or a certain law in physics. We noticed that hey, if we use that, it could actually help with this or that particular AI algorithm.”

This approach could be useful in a variety of specific kinds of tasks, he says, but not all. “We can’t say this is useful for all of AI, but there are instances where we can use an insight from physics to improve on a given AI algorithm.”

Neural networks in general are an attempt to mimic the way humans learn certain new things: The computer examines many different examples and “learns” what the key underlying patterns are. Such systems are widely used for pattern recognition, such as learning to identify objects depicted in photos.

But neural networks in general have difficulty correlating information from a long string of data, such as is required in interpreting a research paper. Various tricks have been used to improve this capability, including techniques known as long short-term memory (LSTM) and gated recurrent units (GRU), but these still fall well short of what’s needed for real natural-language processing, the researchers say.

The team came up with an alternative system, which instead of being based on the multiplication of matrices, as most conventional neural networks are, is based on vectors rotating in a multidimensional space. The key concept is something they call a rotational unit of memory (RUM).

Essentially, the system represents each word in the text by a vector in multidimensional space — a line of a certain length pointing in a particular direction. Each subsequent word swings this vector in some direction, represented in a theoretical space that can ultimately have thousands of dimensions. At the end of the process, the final vector or set of vectors is translated back into its corresponding string of words.
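
For readers who want a concrete picture of that process, here is a toy, parameter-free sketch of rotating a memory vector partway toward each incoming word vector. It is only an illustration of the geometric idea; the published RUM architecture learns its rotations as part of a full recurrent network.

```python
# Toy sketch of the rotation idea behind a rotational unit of memory (RUM).
# Simplified illustration only; the real RUM learns these updates.
import numpy as np

def rotate_towards(state, target, fraction=0.25):
    """Rotate `state` part of the way toward `target`, inside the plane
    the two vectors span; directions outside that plane are untouched."""
    u = state / np.linalg.norm(state)
    w = target - (target @ u) * u          # component of target orthogonal to state
    if np.linalg.norm(w) < 1e-12:          # already (anti-)parallel: nothing to do
        return state
    v = w / np.linalg.norm(w)
    cos_full = np.clip(u @ (target / np.linalg.norm(target)), -1.0, 1.0)
    theta = fraction * np.arccos(cos_full)
    # Orthogonal rotation acting only in the u-v plane (norm-preserving).
    R = (np.eye(state.size)
         + (np.cos(theta) - 1) * (np.outer(u, u) + np.outer(v, v))
         + np.sin(theta) * (np.outer(v, u) - np.outer(u, v)))
    return R @ state

rng = np.random.default_rng(0)
memory = rng.normal(size=8)                # the running "memory" vector
start_norm = np.linalg.norm(memory)
for _ in range(20):                        # each word swings the vector a bit
    word_vec = rng.normal(size=8)
    memory = rotate_towards(memory, word_vec)

# Rotations preserve length, one intuition for why they help memories persist.
assert np.isclose(np.linalg.norm(memory), start_norm)
```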

“RUM helps neural networks to do two things very well,” Nakov says. “It helps them to remember better, and it enables them to recall information more accurately.”

After developing the RUM system to help with certain tough physics problems such as the behavior of light in complex engineered materials, “we realized one of the places where we thought this approach could be useful would be natural language processing,” says Soljačić, recalling a conversation with Tatalović, who noted that such a tool would be useful for his work as an editor trying to decide which papers to write about. Tatalović was at the time exploring AI in science journalism as his Knight fellowship project.

“And so we tried a few natural language processing tasks on it,” Soljačić says. “One that we tried was summarizing articles, and that seems to be working quite well.”

The proof is in the reading

As an example, they fed the same research paper through a conventional LSTM-based neural network and through their RUM-based system. The resulting summaries were dramatically different.

The LSTM system yielded this highly repetitive and fairly technical summary: “Baylisascariasis,” kills mice, has endangered the allegheny woodrat and has caused disease like blindness or severe consequences. This infection, termed “baylisascariasis,” kills mice, has endangered the allegheny woodrat and has caused disease like blindness or severe consequences. This infection, termed “baylisascariasis,” kills mice, has endangered the allegheny woodrat.

Based on the same paper, the RUM system produced a much more readable summary, and one that did not include the needless repetition of phrases: Urban raccoons may infect people more than previously assumed. 7 percent of surveyed individuals tested positive for raccoon roundworm antibodies. Over 90 percent of raccoons in Santa Barbara play host to this parasite.

Already, the RUM-based system has been expanded so it can “read” through entire research papers, not just the abstracts, to produce a summary of their contents. The researchers have even tried using the system on their own research paper describing these findings — the paper that this news story is attempting to summarize.

Here is the new neural network’s summary: Researchers have developed a new representation process on the rotational unit of RUM, a recurrent memory that can be used to solve a broad spectrum of the neural revolution in natural language processing.

It may not be elegant prose, but it does at least hit the key points of information.

Çağlar Gülçehre, a research scientist at the British AI company DeepMind Technologies, who was not involved in this work, says this research tackles an important problem in neural networks, having to do with relating pieces of information that are widely separated in time or space. “This problem has been a very fundamental issue in AI due to the necessity to do reasoning over long time-delays in sequence-prediction tasks,” he says. “Although I do not think this paper completely solves this problem, it shows promising results on the long-term dependency tasks such as question-answering, text summarization, and associative recall.”

Gülçehre adds, “Since the experiments conducted and model proposed in this paper are released as open-source on GitHub, many researchers will be interested in trying it on their own tasks. … To be more specific, potentially the approach proposed in this paper can have very high impact on the fields of natural language processing and reinforcement learning, where the long-term dependencies are very crucial.”

The research received support from the Army Research Office, the National Science Foundation, the MIT-SenseTime Alliance on Artificial Intelligence, and the Semiconductor Research Corporation. The team also had help from the Science Daily website, whose articles were used in training some of the AI models in this research.

Jump-starting the economy with science

In 1988, the U.S. federal government created a $3 billion, 15-year project to sequence the human genome. Not only did the project advance science, it hit the economic jackpot: In 2012, human genome sequencing accounted for an estimated 280,000 jobs, $19 billion in personal income, $3.9 billion in federal taxes, and $2.1 billion in state and local taxes. And all for a price of $2 per year per U.S. resident.

“It’s an incredible rate of return,” says MIT economist Simon Johnson.

It’s not just genomics that pays off. Every additional $10 million in public funding granted to the National Institutes of Health, according to one MIT study, on average produces 2.7 patents and an additional $30 million in value for the private-sector firms that own those patents. When it comes to military technology, each dollar in publicly funded R&D leads to another $2.50-$5.90 in private-sector investment.

In general, “Public investment in science has very big economic returns,” says Johnson, who is the Ronald A. Kurtz Professor of Entrepreneurship at the MIT Sloan School of Management.

Yet after a surge in science funding spurred by World War II, the U.S. has lowered its relative level of public investment in research and development — from about 2 percent of GDP in 1964 to under half of that today.

Reviving U.S. support of science and technology is one of the best ways we can generate economic growth, according to Johnson and his MIT economist colleague Jonathan Gruber, who is the Ford Professor of Economics in MIT’s Department of Economics. And now Johnson and Gruber make that case in a new book, “Jump-Starting America: How Breakthrough Science Can Revive Economic Growth and the American Dream,” published this month by PublicAffairs press.

In it, the two scholars contend that pumping up public investment in science would create not only overall growth but also better jobs throughout the economy, in an era when stagnating incomes have caused strain for a large swath of Americans.

“Good jobs are for MIT graduates, but they’re also for people who don’t finish college. They’re for people who drop out of high school,” says Johnson. “There’s a tremendous amount of anxiety across the country.”

Hello, Columbus

Indeed, spurring growth across the country is a key theme of “Jump-Starting America.” Technology-based growth in the U.S. has been focused in a few “superstar” cities, where high-end tech jobs have been accompanied by increased congestion and sky-high housing prices, forcing out the less well-off.

“The prosperity has been concentrated in some places where it’s become incredibly expensive to live and work,” Johnson says. That includes Silicon Valley, San Francisco, New York, Los Angeles, Seattle, the Washington area, and the Boston metro area.

And yet, Johnson and Gruber believe, the U.S. has scores of cities where the presence of universities combined with industrial know-how could produce more technology-based growth. Some already have: As the authors discuss in the book, Orlando, Florida, is a center for high-tech computer modeling and simulation, thanks to the convergence of federal investment, the growth of the University of Central Florida, and local backing of an adjacent research park that supports dozens of thriving enterprises.

The Orlando case is “a modern version of what once made America the most prosperous nation on Earth,” the authors write, and they believe it can be replicated widely.

“Let’s spread it around the country, to take advantage of where the talent is in the U.S., because there’s a lot of talent away from the coastal cities,” Johnson says.

“Jump-Starting America” even contains a list of 102 metropolitan areas the authors think are ripe for investment and growth, thanks to well-educated work forces and affordability, among other factors. At the top of the list are Pittsburgh, Rochester, and three cities in Ohio: Cincinnati, Columbus, and Cleveland.

The authors’ list does not include any California cities — where affordability is generally a problem — but they view the ranking as a conversation-starter, not the last word on the subject. The book’s website has an interactive feature where readers can tweak the criteria used to rank cities, and see the results.

“We’d like people to challenge us and say, maybe we should think of the criteria differently,” Johnson says. “Everyone should be thinking about what have we got in our region, what do we need to get, and what kind of investment would make the difference here.”

A dividend on your investment

“Jump-Starting America” has received praise from scholars and policy experts. Susan Athey, an economist at Stanford University, calls the book “brilliant” and says it “brings together economic history, urban economics, and the design of incentives to build an ambitious proposal” for growth. Jean Tirole, of the Toulouse School of Economics, says the book gives a boost to industrial policy, by showing “how the government can promote innovation while avoiding the classic pitfalls” of such interventions.

For their part, Johnson and Gruber readily acknowledge that public investment in R&D is just one component of long-term growth. Continued private-sector investment, they note, is vital as well. Still, the book does devote a chapter to the limits of private investment, including the short-term focus on returns that has led many firms to scale back their own R&D operations.

“We’re very pro-private sector,” Johnson says. “I’m a professor of entrepreneurship at Sloan, and I work a lot with entrepreneurs around the world and venture capitalists. They will tell you, quite frankly … their incentives are to make money relatively quickly, given their time horizons and what their investors want. As a result they are drawn to a few sectors, including information technology, and within that more software than hardware these days.”

As a sweetener for any program of public science investment, the authors also suggest that people should receive a kind of annual innovation dividend — a return on their tax dollars. In effect, this would be a scaled-up version of the dividend that, for instance, Alaskans receive on that state’s energy revenues.

That would be a departure from current U.S. policy, but ultimately, Johnson and Gruber say, a departure is what we need.

“We don’t find the existing policies from anyone compelling,” Johnson says. “So we wanted to put some ideas out there and really start a debate about those alternatives, including a return to a bigger investment in science and technology.”

Dress Codes For Women Golfers

There is no doubt that, like all sports, golf has some well-defined dress codes, and many people are of the opinion that those codes are tough and strict, especially for women. So where does the truth actually lie? It is worth finding out. Here we take a look at women’s golf apparel so that those who are keen on it can get the information they need. It is also worth mentioning that golf attire has grown and changed with the times. Gone are the days when the dresses were heavy, unwieldy, and covered almost the entire body.

Today, without being overly provocative, women’s golf dress codes focus on comfort and style while staying in line with today’s fashion. The dress code, while broadly uniform, does vary among individual golf organizers: some are liberal about dress habits for women golfers, while others remain rigid and conservative. There are, however, some general rules for women golfers’ attire, and it is worth looking at a few of them for the sake of our customers and other information seekers.

Tops

On most golf courses, women are required to wear blouses, either with or without sleeves, and there are no major restrictions on color. Most women prefer polo-style shirts, which are extremely comfortable and convenient for taking shots and moving around the course. These polos are available in many colors and designs, including v-neck, button-down, and zip-top styles, with short or long sleeves. Beyond solid colors, you will also see women golfers wearing floral prints, stripes, and other patterns. There are other options too, but a few garments are generally not allowed, including halters, t-shirts, and tank tops.

Sweaters and Jackets

Dressing in layers is becoming quite common for golfers. Many women wear a sweater or vest over a turtleneck or polo shirt, which comes in very handy on a cold day when the temperature drops. It is also common to see golfers pair collared button-down shirts with a light golf jacket or wind shirt for additional coverage. Denim and sweat jackets, however, are not acceptable.

Bottoms

In early fall or spring, slacks are the most common choice for women on the golf course. In warm weather, women often go for capris and shorter slacks, and shorts and crops can also be tried. Shorter pants, however, should be at least knee-length; anything shorter may not be accepted.

In short, women have quite a few options and choices when it comes to golf attire.

Contact Us:

FlirTee Golf

Address:
3601 NW 175th St
Edmond, OK
Phone: (405) 568-8944

MIT spinout seeks to transform food safety testing

“This is a $10 billion market and everyone knows it.” Those are the words of Chris Hartshorn, CEO of a new MIT spinout — Xibus Systems — that is aiming to make a splash in the food industry with its new food safety sensor.

Hartshorn has considerable experience supporting innovation in agriculture and food technology. Prior to joining Xibus, he served as chief technology officer for Callaghan Innovation, a New Zealand government agency. A large portion of the country’s economy relies upon agriculture and food, so a significant portion of the innovation activity there is focused on those sectors.

While there, Hartshorn came in contact with a number of different food safety sensing technologies that were already on the market, aiming to meet the needs of New Zealand producers and others around the globe. Yet, “every time there was a pathogen-based food recall” he says, “it shone a light on the fact that this problem has not yet been solved.” 

He saw innovators across the world trying to develop a better food pathogen sensor, but when Xibus Systems approached Hartshorn with an invitation to join as CEO, he saw something unique in their approach, and decided to accept.

Novel liquid particles provide quick indication of food contamination

Xibus Systems was formed in the fall of 2018 to bring a fast, easy, and affordable food safety sensing technology to food industry users and everyday consumers. The development of the technology, based on MIT research, was supported by two commercialization grants through the MIT Abdul Latif Jameel Water and Food Systems Lab’s J-WAFS Solutions program. It is based on specialized droplets — called Janus emulsions — that can be used to detect bacterial contamination in food. The use of Janus droplets to detect bacteria was developed by a research team led by Tim Swager, the John D. MacArthur Professor of Chemistry, and Alexander Klibanov, the Novartis Professor of Biological Engineering and Chemistry.

Swager and researchers in his lab originally developed the method for making Janus emulsions in 2015. Their idea was to create a synthetic particle that has the same dynamic qualities as the surface of living cells. 

The liquid droplets consist of two hemispheres of equal size, one made of a blue-tinted fluorocarbon and one made of a red-tinted hydrocarbon. The hemispheres are of different densities, which affects how they align and how opaque or transparent they appear when viewed from different angles. They are, in effect, lenses. What makes these micro-lenses particularly unique, however, is their ability to bind to specific bacterial proteins. Their binding properties enable the droplets to move, flipping from red to blue based on the presence or absence of a particular bacterium, such as Salmonella.

“We were thrilled by the design,” Swager says. “It is a completely new sensing method that could really transform the food safety sensing market. It showed faster results than anything currently available on the market, and could still be produced at very low cost.”

Janus emulsions respond exceptionally quickly to contaminants and provide quantifiable results that are visible to the naked eye or can be read via a smartphone sensor. 
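
As a rough illustration of what a smartphone readout could measure, the sketch below estimates the fraction of droplet pixels that have flipped to red rather than blue. The threshold and the red/blue comparison are invented for illustration; the article does not describe Xibus’s actual image pipeline.

```python
# Hypothetical droplet readout: fraction of droplet pixels reading "flipped."
import numpy as np

def flipped_fraction(rgb_image: np.ndarray, min_signal: int = 40) -> float:
    """rgb_image: H x W x 3 uint8 array from a phone camera. Returns the
    fraction of droplet pixels that read red (flipped) rather than blue."""
    r = rgb_image[..., 0].astype(int)
    b = rgb_image[..., 2].astype(int)
    droplet = (r + b) > min_signal        # ignore dark background pixels
    flipped = droplet & (r > b)
    return flipped.sum() / max(droplet.sum(), 1)

# Synthetic test frame: left half reads blue, right half reads red.
frame = np.zeros((10, 10, 3), dtype=np.uint8)
frame[:, :5, 2] = 200                     # blue (unflipped) half
frame[:, 5:, 0] = 200                     # red (flipped) half
print(flipped_fraction(frame))            # -> 0.5
```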

“The technology is rooted in very interesting science,” Hartshorn says. “What we are doing is marrying this scientific discovery to an engineered product that meets a genuine need and that consumers will actually adopt.”

Having already secured nearly $1 million in seed funding from a variety of sources and been accepted into Sprout, a highly respected agri-food accelerator, the company is off to a fast start.

Solving a billion-dollar industry challenge

Why does speed matter? In the field of food safety testing, the standard practice is to culture food samples to see if harmful bacterial colonies form. This process can take many days, and often can only be performed offsite in a specialized lab.

While more rapid techniques exist, they are expensive and require specialized instruments — which are not widely available — and still typically require 24 hours or more from start to finish. In instances where there is a long delay between food sampling and contaminant detection, food products could have already reached consumers’ hands — and upset their stomachs. While the instances of illness and death that can occur from food-borne illness are alarming enough, there are other costs as well. Food recalls result in tremendous waste, not only of the food products themselves but of the labor and resources involved in their growth, transportation, and processing. Food recalls also involve lost profit for the company. North America alone loses $5 billion annually in recalls, and that doesn’t count the indirect costs associated with the damage that occurs to particular brands, including market share losses that can last for years.

The food industry would benefit from a sensor that could provide fast and accurate readings of the presence and amount of bacterial contamination on-site. The Swager Group’s Janus emulsion technology has many of the elements required to meet this need and Xibus Systems is working to improve the speed, accuracy, and overall product design to ready the sensor for market.

Two other J-WAFS-funded researchers have helped improve the efficiency of early product designs. Mathias Kolle, assistant professor in the Department of Mechanical Engineering at MIT and recipient of a separate 2017 J-WAFS seed grant, is an expert on optical materials. In 2018, he and his graduate student Sara Nagelberg performed the calculations describing light’s interaction with the Janus particles so that Swager’s team could modify the design and improve performance. Kolle continues to be involved, serving with Swager on the technical advisory team for Xibus. 

This effort was a new direction for the Swager group. Says Swager: “The technology we originally developed was completely unprecedented. At the time that we applied for a J-WAFS Solutions grant, we were working in new territory and had minimal preliminary results. At that time, we would not have made it through, for example, government funding reviews, which can be conservative. J-WAFS sponsorship of our project at this early stage was critical to help us to achieve the technology innovations that serve as the foundation of this new startup.”

Xibus co-founder Kent Harvey — also a member of the original MIT research team — is joined by Matthias Oberli and Yuri Malinkevich. Together with Hartshorn, they are working on a prototype for initial market entry. They are developing two different products: a smartphone sensor that is accessible to everyday consumers, and a portable handheld device that is more sensitive and would be suitable for industry. If they are able to build a successful platform that meets industry needs for affordability, accuracy, ease of use, and speed, they could apply that platform to any situation where a user needs to analyze organisms that live in water. This opens up many sectors in the life sciences, including water quality, soil sensing, and veterinary diagnostics, as well as fluid diagnostics for the broader healthcare sector.

The Xibus team wants to nail their product right off the bat.

“Since food safety sensing is a crowded field, you only get one shot to impress your potential customers,” Hartshorn says. “If your first product is flawed or not interesting enough, it can be very hard to open the door with these customers again. So we need to be sure our prototype is a game-changer. That’s what’s keeping us awake at night.”

The evolving definition of a gene

More than 50 years ago, scientists came up with a definition for the gene: a sequence of DNA that is copied into RNA, which is used as a blueprint for assembling a protein.

In recent years, however, with the discovery of ever more DNA sequences that play key roles in gene expression without being translated into proteins, this simple definition needed revision, according to Gerald Fink, the Margaret and Herman Sokol Professor in Biomedical Research and American Cancer Society Professor of Genetics in MIT’s Department of Biology.

Fink, a pioneer in the field of genetics, discussed the evolution of this definition during yesterday’s James R. Killian Jr. Faculty Achievement Award Lecture, titled, “What is a Gene?”

“In genetics, we’ve lost a simple definition of the gene — a definition that lasted over 50 years,” he said. “But loss of the definition has spawned whole new fields trying to understand the unknown information in non-protein-coding DNA.”

Established in 1971 to honor MIT’s 10th president, James Killian, the Killian Award recognizes extraordinary professional achievements by an MIT faculty member. Fink, who is also a member and former director of the Whitehead Institute, was honored for his achievements in developing brewer’s yeast as “the premier model for understanding the biology of eukaryotes” — organisms whose cells have nuclei.

“He is among the very few scientists who can be singularly credited with fundamentally changing the way we approach biological problems,” says the award citation, read by Susan Silbey, chair of the MIT faculty, who presented Fink with the award.

Genetic revolution

Growing up in a “sleepy” town on Long Island, Fink had a keen interest in science, which spiked after the Soviets launched the first satellite to orbit the Earth.

“In 1957, when I went out in our backyard, I was hypnotized by the new star in the sky, as Sputnik slowly raced toward the horizon,” he said. “Overnight, science became a national priority, energized by the dread of Soviet technology and technological superiority.”

After earning his bachelor’s degree at Amherst College, Fink began studying yeast as a graduate student at Yale University, and in 1976, he developed a way to insert any DNA sequence into yeast cells.

This discovery transformed biomedical research by allowing scientists to program yeast to produce any protein they wanted, as long as they knew the DNA sequence of the gene that encoded it. It also proved industrially useful: More than half of all therapeutic insulin is now produced by yeast, along with many other drugs and vaccines, as well as biofuels such as ethanol.

At that time, scientists were operating with a straightforward definition of the gene, based on the “central dogma” of biology: DNA makes RNA, and RNA makes proteins. Therefore, a gene was defined as a sequence of DNA that could code for a protein. This was convenient because it allowed computers to be programmed to search the genome for genes by looking for specific DNA sequences bracketed by codons that indicate the starting and stopping points of a gene.
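
A toy version of that kind of scan looks for an open reading frame: a start codon followed, in the same frame, by a stop codon. Real gene finders also handle the reverse strand, splicing, and statistical scoring, so this sketch shows only the basic bracketing idea.

```python
# Minimal open-reading-frame scan: find ATG ... in-frame stop codon.
START, STOPS = "ATG", {"TAA", "TAG", "TGA"}

def find_orfs(dna: str, min_codons: int = 2):
    orfs = []
    for frame in range(3):                   # check all three reading frames
        i = frame
        while i + 3 <= len(dna):
            if dna[i:i+3] == START:
                for j in range(i + 3, len(dna) - 2, 3):
                    if dna[j:j+3] in STOPS:  # first in-frame stop closes the ORF
                        if (j - i) // 3 >= min_codons:
                            orfs.append(dna[i:j+3])
                        i = j                # resume scanning past this ORF
                        break
            i += 3
    return orfs

print(find_orfs("CCATGAAATTTTAGGG"))         # -> ['ATGAAATTTTAG']
```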

In recent decades, scientists have done just that, identifying about 20,000 protein-coding genes in the human genome. They have also discovered genetic mechanisms involved in thousands of human diseases. Using new tools such as CRISPR, which enables genome editing, cures for such diseases may soon be available, Fink believes.

“The definition of a gene as a DNA sequence that codes for a protein, coupled with the sequencing of the human genome, has revolutionized molecular medicine,” he said. “Genome sequencing, along with computational power to compare and analyze genomes, has led to important insights into basic science and disease.”

However, he pointed out, protein-coding genes account for just 2 percent of the entire human genome. What about the rest of it? Scientists have traditionally referred to the remaining 98 percent as “junk DNA” that has no useful function.

In the 1980s, Fink began to suspect that this junk DNA was not as useless as had been believed. He and others discovered that in yeast, certain segments of DNA could “jump” from one location to another, and that these segments appeared to regulate the expression of whatever genes were nearby. This phenomenon was later observed in human cells as well.

“That alerted me and others to the fact that ‘junk DNA’ might be making RNA but not proteins,” Fink said.

Since then, scientists have discovered many types of non-protein-coding RNA molecules, including microRNAs, which can block the production of proteins, and long non-coding RNAs (lncRNAs), which have many roles in gene regulation.

“In the last 15 years, it has been found that these are critical for controlling the gene expression of protein-coding genes,” Fink said. “We’re only now beginning to visualize the importance of this formerly invisible part of the genome.”

Such discoveries demonstrate that the traditional definition of a gene is inadequate to encompass all of the information stored in the genome, he said.

“The existence of these diverse classes of RNA is evidence that there is no single physical and functional unit of heredity that we can call the gene,” he said. “Rather, the genome contains many different categories of informational units, each of which may be considered a gene.”

“A community of scholars”

In selecting Fink for the Killian Award, the award committee also cited his contributions to the founding of the Whitehead Institute, which opened in 1982. At the time, forming a research institute that was part of MIT yet also its own entity was considered a “radical experiment,” Fink recalled.

Though controversial at the time, with heated debate among the faculty, establishing the Whitehead Institute laid the groundwork for many other research institutes that have been established at MIT, and also helped to attract biotechnology companies to the Kendall Square area, Fink said.

“As we now know, MIT made the right decision. The Whitehead turned out to be a successful pioneer experiment that in my opinion led to the blossoming of the Kendall Square area,” he said.

Fink was hired as one of the first faculty members of the Whitehead Institute, and served as its director from 1990 to 2001, when he oversaw the Whitehead’s contributions to the Human Genome Project. He recalled that throughout his career, he has collaborated extensively not only with other biologists, but with MIT colleagues in fields such as physics, chemical engineering, and electrical engineering and computer science.

“MIT is a community of scholars, and I was welcomed into the community,” he said.

Letter regarding the MIT Schwarzman College of Computing community forums in April

The following letter was sent to the MIT community on April 4 by Provost Martin A. Schmidt.

To the members of the MIT community:

I write to invite you to a series of three community forums with the MIT Stephen A. Schwarzman College of Computing Working Groups on April 17 and 18.

As you may recall, in February five working groups were charged with developing ideas and options for the MIT administration to consider in planning the structure and operation of the new College, with the intent of completing this work in May. The forums will provide updates on the working groups’ discussions to date, and give you an opportunity to share questions, thoughts, and ideas with the working group co-chairs leading the discussions.

Because some of the content the working groups are exploring overlaps, we have structured the forums as follows:

  1. Joint forum with the Social Implications and Responsibilities of Computing and Academic Degrees working groups
    Wednesday, April 17, 10:30 am–12:00 pm
    Kresge Little Theatre (W16-035)
     
  2. Joint forum with the Organizational Structure and Faculty Appointments working groups
    Wednesday, April 17, 1:00–2:30 pm
    Kresge Little Theatre (W16-035)
     
  3. Computing Infrastructure working group forum
    Thursday, April 18, 10:30 am–11:30 am
    Samberg Dining Rooms 3 and 4 (E52, 6th floor)

Please visit the MIT Schwarzman College of Computing Task Force website for more information on the working groups and to contribute your suggestions to the Idea Bank.

Your participation and input are important. The working groups and I hope that you will attend any or all of these forums.

I also want to note that, beyond these forums, we will be planning additional opportunities for community engagement going forward as we plan for the new College.

Sincerely,

Martin A. Schmidt

KSA meeting explores collaboration in 2019

Speakers at the Kendall Square Association (KSA) annual meeting yesterday considered the ways collaboration in our society is changing and gave updates on their work to strengthen the bonds between members of the local community.

Many organizations champion the concept of collaboration, but this year the KSA wanted to take a closer look at the influence modern technology has had on collaboration between people, organizations, and societies.

“We all talk about collaboration — it’s kind of obvious — but there are new ways to think about collaboration that can help us work more deeply with one another and get things done more quickly,” says Sarah Gallop, who serves as chair of the board at the KSA and co-director of MIT’s Office of Government and Community Relations. “We’re not talking about superficial collaboration here. We’re talking about intense collaboration, with depth, that allows us to do more by working together.”

The KSA is a nonprofit organization made up of industry, government, and academic officials, which seeks to build partnerships, host events, advocate for public policy issues, and tell the story of Kendall Square’s transformation. More than 200 people from a wide range of companies and organizations attended this year’s meeting, which was hosted by event sponsor Boston Marriott Cambridge.

The meeting also included talks on the KSA’s Diversity, Equity, and Inclusion Initiative; efforts to improve commuting options; and recent accomplishments in the bustling Kendall Square area that many at the meeting referred to as the most innovative square mile in the world.

Collaboration in 2019

The meeting’s keynote speaker was Jeremy Heimans, co-author of the recently released book “New Power: How Power Works in Our Hyperconnected World — and How to Make It Work for You.” In his talk, Heimans described the new power structures enabled by technology, citing examples such as the #MeToo movement’s effectiveness at holding previously untouchable company executives accountable for sexual misconduct, and the fact that Airbnb is currently more valuable than Hilton.

He described new power as current, open, and made by many people, and he challenged attendees to start thinking about when and where to turn to these new power dynamics in their organizations.

Gallop says the talk was a good way to help KSA members collectively develop new ways of thinking.

“It’s so important for us to learn together,” Gallop says. “Each year, we bring in a speaker that’s going to help us broaden our minds, broaden our thinking, and make us think about current issues. Jeremy is teaching us how to make things happen and how to accelerate our work even more.”

During a Q&A after his talk, Heimans told members of the audience that they have an important role to play in this new power landscape, defending science and reason in a society that’s less hierarchical than ever before and where knowledgeable voices can more easily get drowned out.

“Expertise is under assault right now,” Heimans said. “We need you. You’re already densely connected, so experiment with ways to further unleash that. What can you do to increase collaboration, tackle these issues, and spread what you do so well — innovation — around the world?”

A busy 10 years

The KSA has long been focused on the issues of connectivity, transportation, and diversity and inclusion. Gallop says those goals make MIT’s close partnership with the organization easy.

“I have always felt that the KSA and MIT have very similar values in terms of our missions around learning, being inclusive, and sharing information,” Gallop said. “For me, as an MIT employee and chair of the KSA board, there’s a lot of synergy between the two organizations that makes it a natural fit.”

Last year, the KSA assembled a Diversity and Inclusion Learning Community to get leaders brainstorming ways to promote diversity in the area. Since then, they’ve hired an expert facilitator to guide the group, and they are currently assembling their second cohort of leaders in the area.

The KSA also announced a series of design-thinking workshops and meetings to be held throughout April as part of its placemaking initiative. The KSA defines placemaking as “a collaborative process to better define, activate, and program public space.” To help with the initiative, the KSA assembled a working group of professionals with placemaking expertise that includes MIT faculty members, urban planners, real estate developers, and community and business leaders.

“We want to optimize conditions for connectivity and collaboration,” said Jesse Baerkahn, founder of urban design firm Graffito SP and a member of the KSA’s board of directors. “We’ll do that by injecting play, fun, and beauty into the area that reflects the magic and uniqueness that makes Kendall Square what it is today.”

Improving commuting times and safety for Kendall Square’s workers is another area where the KSA has been active over the last year. Members of the KSA meet regularly with the MBTA and the Massachusetts Department of Transportation, and have been working closely with other advocacy groups like T4MA and A Better City over the past year.

In October, the KSA launched the Transportation ADVANCE Initiative to engage the Kendall Square community through a series of experiments to find hyper-local solutions to improving workers’ commutes.

Of course, the KSA’s 10th anniversary also offered an opportunity to reflect on how much Kendall Square and the KSA have evolved over the years. The area’s past as an industrial wasteland has been well-documented, but attendees also discussed the more recent transformation of the last 10 years.

“Everywhere I go, people talk about how Kendall Square is changing, with new buildings, restaurants, gathering spaces, people, inventions, and companies,” Gallop said in her opening remarks. “Sometimes, I have to take a minute when I’m walking around because a building I swear wasn’t there last week is all of a sudden completed in the square. But Kendall has always been about change.”

In a manner similar to the area, Gallop says the KSA’s priorities are always evolving.

“Our priorities may change; the beauty of Kendall Square is that it’s always changing,” Gallop says. “If you come back in a few years there might be a new priority area — or three new ones.”

The power of play

A new collaboration between MIT.nano and NCSOFT, a video game development company based in South Korea, will seek to chart the future of how people interact with the world and each other via the latest science and technology for gaming.

The creation of the MIT.nano Immersion Lab Gaming Program, says Brian W. Anthony, the associate director of MIT.nano, arose out of a series of discussions with NCSOFT, a founding member of the MIT.nano industry consortium. “How do we apply the language of gaming to technology research and education?” Anthony asks. “How can we use its tools — and develop new ones — to connect different domains, give people better ways to collaborate, and improve how we communicate and interact?” 

As part of the collaboration, NCSOFT will provide funding to acquire hardware and software tools to outfit the MIT.nano Immersion Lab, the facility’s research and collaboration space for investigations in artificial intelligence, virtual and augmented reality, artistic projects, and other explorations at the intersection of hard tech and human beings. The program, set to run for four years, will also offer annual seed grants to MIT faculty and researchers for projects in science, technology, and applications of:

  • gaming in research and education;
  • communication paradigms;
  • human-level inference; and
  • data analysis and visualization.

A mini-workshop to bring together MIT principal investigators and NCSOFT technology representatives will be held at MIT.nano on April 25.

Anthony, who is also faculty lead for the MechE Alliance Industry Immersion Projects and principal research scientist in the Department of Mechanical Engineering and the Institute for Medical Engineering and Science, says the collaboration will support projects that explore data collection for immersive environments, novel techniques for visualization, and new approaches to motion capture, sensors, and mixed-reality interfaces. 

Collaborating with a gaming company, he says, comes with intriguing opportunities for new research and education paradigms. Specific topics the seed-grant program hopes to prompt include improved detection and reduction of dizziness associated with immersive headsets; automatic voice generation based on the appearance or emotional states of virtual characters; and reducing the cost and improving the accuracy of marker-less, video-based motion capture in open space. 

Another area of interest, Anthony says, is the development of new tools and techniques for detecting and learning how personal gestures communicate intent. “I gesticulate a lot when I’m talking,” he says. “What does it mean, and how do we appropriately capture and model that?” And what if, he adds, researchers could apply what they learn to empower a doctor in a clinical setting? “Maybe the doctor wants to palpate an image of tissue, which includes measures of elasticity, to see how it deforms or slice it to see how it responds. Tools that can record and visualize these gestures and reactions could be tremendously powerful.”

The collaboration’s work will involve the creation of software and the development of new hardware. The Immersion Lab Gaming Program’s leaders hope to spur a range of efforts to explore visual, acoustic, and haptic technologies. For example, Anthony says, it is possible to use sound waves that are outside the range of what is audible to humans for tactile purposes. Expressed with the right tools, these waves can actually be touched in mid-air.
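
The principle behind such mid-air haptics is acoustic focusing: drive an array of ultrasonic emitters with phase offsets chosen so every wavefront arrives at a target point in phase, creating a pressure spot the skin can feel. The sketch below computes those offsets for a generic 40 kHz array; the geometry and parameters are illustrative, not tied to any MIT.nano hardware.

```python
# Illustrative phased-array focusing for ultrasonic mid-air haptics.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
FREQ = 40_000.0          # 40 kHz: well above the limit of human hearing

# A small 4x4 grid of emitters, 1 cm apart, lying in the z = 0 plane.
xs, ys = np.meshgrid(np.arange(4) * 0.01, np.arange(4) * 0.01)
emitters = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])

def focus_phases(focal_point: np.ndarray) -> np.ndarray:
    """Per-emitter phase offsets so all wavefronts reach `focal_point`
    in phase, producing a localized pressure peak you can touch."""
    distances = np.linalg.norm(emitters - focal_point, axis=1)
    travel_time = distances / SPEED_OF_SOUND
    return (2 * np.pi * FREQ * travel_time) % (2 * np.pi)

# Focus 10 cm above the center of the array.
phases = focus_phases(np.array([0.015, 0.015, 0.10]))
print(phases.round(2))
```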

The combination of hardware and software development was one of the key factors in NCSOFT’s decision to work with MIT and to become one of MIT.nano’s founding industry partners, Anthony says. And the third-floor Immersion Lab — specifically designed to facilitate immersive digital experiences — provides a flexible platform for the new program.

“Most of this building is about making or visualizing physical things at the nanometer scale,” Anthony adds. “But this space will be a magnet for people who do data-science research. We want to use the Immersion Lab to increase friction between the hardware and software folks — and I mean friction in a good way, like rubbing elbows — and to interact with the data coming from the building and to imagine the new hardware required to better interact with data, which can then be made in MIT.nano.”

MIT.nano has released the program’s first call for proposals. Applications are due May 25. 
