Finding a Safe Storm Shelter During a Thunderstorm

Storm Shelters in OKC

Tuesday, June 5, 2001 marked the start of an extraordinary time in the history of my beloved Houston. Tropical Storm Allison came to visit that early summer day. That Tuesday the storm moved through quickly. Then Friday arrived, and Allison returned, this time moving slowly, this time from the north. The storm stalled. Thousands of people were driven from their homes. Several leading hospitals closed just when they were needed most. Dozens of important surface roads, and every major highway, were covered in high water.

Yet even before the rain stopped, stories of service to others and of Christian compassion began to be written. About 75 people had assembled at Lakewood Church, one of the largest nondenominational churches in the United States, for a couples class. By the time they got ready to depart, the waters had climbed so high that they were stranded. Lakewood’s facility stayed high and dry at the center of one of the hardest-hit parts of town. Refugees from the powerful storm began arriving at its doorstep. With no advance preparation, and without waiting for official sanction, those 75 classmates started a disaster shelter that grew to hold over 3,000 people, the largest of more than 30 shelters that would be established at the height of the storm.

Afterward, Lakewood functioned as a Red Cross Service Center, where help was doled out to those who had suffered losses. When it became clear that FEMA and Red Cross aid would not be enough, Lakewood and Second Baptist Houston joined to produce an adopt-a-family plan to help get folks on their feet more quickly. In the days that followed, armies of Christians arrived at both churches. People of every economic standing, race, and denomination gathered from all over town. Wet, rotted carpets were pulled up and sheetrock removed. Piles of donated clothes, food, and bedding were doled out. Elbow grease and cleaning equipment were used to start eliminating traces of the damage.

If the story stopped here, it would already be an excellent example of practical ministry in a time of disaster, but it continues. Many other churches served as shelters and, in the days that followed, as Red Cross Service Centers. Scores of new volunteers, many of them Christians, were put through accelerated training and put to work. That Saturday I was trapped in my own subdivision, certain that my family was safe because I worked in Storm Shelters OKC, which was near where I lived. What they would not permit the storm to take was their need to give, their faith, or their self-respect. I saw so many people praising the Lord as they brought gifts of food, clothing, and bedding. I saw young children coming with their parents to give new or rarely used toys to children who had none.

Leaning On God Through Hard Times

Unity Church of Christianity, from a location across town affected by the storm, sent a sizable supply of bedding and other materials. A small troupe of musicians and Christian clowns arrived and asked to be permitted to entertain the children in the shelter where I served. We of course promptly accepted their offer. They gathered the children in a large empty stretch of floor. They sang, they told stories, and they made balloon animals. The children, frightened and at least briefly displaced, laughed.

When not occupied elsewhere, I did a lot of listening. I listened to disappointed survivors and frustrated relief workers. I listened to children trying to make the best of a situation they could not comprehend. These are only the stories I have seen or heard myself. I am aware that churches, spiritual groups, and many other individual Christians served admirably. I want to thank them for their efforts in the disaster. I thank the Lord for providing them to serve.

I didn’t write this so you would feel sorry for Houston or its people. What I saw as this disaster unfolded strengthened my belief that the Lord will provide for us through our brothers and sisters in faith. No matter how badly your community is hit, you, the individual Christian, can be part of the remedy. Those blankets you have stored away and will probably never use mean a great deal to people who have none. You can help if you can drive. You can help if you can set up a cot. You can help if you can scrub a wall. You can help if all you can do is sit and listen. Large catastrophes like Allison get a lot of attention, but a disaster can come in any size. A single house burning down is a serious disaster to the family that called it home. It will be generations before the people here forget Allison.

United States Oil and Gas Exploration Opportunities

Firms investing in this sector can explore, develop, and produce, and enjoy the advantages of a global oil and gas portfolio without the usual political and economic disadvantages. The US permitting regime and financial conditions are rated among the best in the world, and petroleum produced in the US is sold at international prices. Firms also stand to gain because the US has a booming domestic market. Most petroleum exploration in the US has been concentrated around the Taranaki Basin, where 500 exploration wells have been drilled. The remaining US sedimentary basins are still largely unexplored; many show evidence of petroleum seeps and structures, and survey data has also revealed formations with high hydrocarbon potential. There have been onshore gas discoveries in the past, including the Great South and East Coast basins and offshore Canterbury.

Demand for petroleum is expected to grow strongly during this period, which does nothing to dim the sector’s bright prospects. Demand is anticipated to reach 338 PJ per annum. The US government is eager to augment the oil and gas supply. Because new discoveries are needed to meet domestic demand, raise the level of self-reliance, and minimize spending on petroleum imports, the oil and gas exploration sector is considered one of the sunrise sectors. The US government has devised a distinctive approach to reach its petroleum and gas exploration targets: it has developed a “Benefit For Attempt” model for petroleum and gas exploration projects in the US.

In current analysis, “Benefit For Attempt” is defined as oil reserves found per kilometer drilled. It helps derive an estimate of the reserves found for each kilometer drilled and each dollar spent on exploration. Because the cost of exploration weighs on exploration activity, the US government has shown considerable signs that it will make changes favoring the exploration of new oil reserves. The government has made information about the country’s oil potential accessible in its study report. Transparency of information in royalty and allocation regimes, and simplicity of processes, have enhanced the attractiveness of the petroleum and natural gas sector in the United States.
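
As a rough numerical illustration of this metric (a hypothetical sketch; the function and figures below are invented for the example and are not drawn from the government’s actual model):

```python
def benefit_for_attempt(reserves_found_bbl: float,
                        kilometers_drilled: float,
                        exploration_cost_usd: float) -> dict:
    """Illustrative 'reserves per effort' metrics: barrels found per
    kilometer drilled and per dollar spent on exploration."""
    return {
        "bbl_per_km": reserves_found_bbl / kilometers_drilled,
        "bbl_per_usd": reserves_found_bbl / exploration_cost_usd,
    }

# Hypothetical campaign: 2 million barrels found over 40 km of drilling
# at a total exploration cost of $50 million.
print(benefit_for_attempt(2_000_000, 40, 50_000_000))
# {'bbl_per_km': 50000.0, 'bbl_per_usd': 0.04}
```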

Petroleum was the third-biggest export earner for the US in 2008, and the opportunity to keep up the sector’s growth is broadly available by way of new exploration endeavors. The government is poised to keep the momentum in this sector. Many firms are now active in new exploration projects in the Challenger Plateau of the United States, the Northland East Slope Basin region, the outer Taranaki Basin, and the Bellona Trough region. The 89 Energy oil and gas sector reassures foreign investors, as the government, to encourage growth, has declared a five-year continuation of an exemption for offshore petroleum and gas exploration in its 2009 budget. The authorities also provide nonresident rig operators with tax breaks.

Modern Robot Duct Cleaning Uses

Heating, ventilation, and air conditioning systems collect pollutants and contaminants such as mold, debris, dust, and bacteria that can have an adverse impact on indoor air quality. Most people are now aware that indoor air pollution can be a health concern, and the issue has gained visibility accordingly. Studies have also suggested that cleaning these systems improves their efficiency and contributes to a longer operating life, along with maintenance and energy cost savings. Duct cleaning is the cleaning of the components of forced-air heating, ventilation, and cooling systems. Robots are an advantageous tool, improving both the cost and efficiency of the procedure. Using modern robots for duct cleaning is therefore no longer a new practice.

A clean air duct system creates a cleaner, healthier indoor environment while lowering energy costs and increasing efficiency. As we spend more hours indoors, air duct cleaning has become an important part of the cleaning industry. Indoor pollutant levels can build up. Health effects can show up immediately or years after repeated or prolonged exposure. They range from respiratory diseases to cardiovascular disease and cancers that can be debilitating or deadly. It is therefore wise to ensure that indoor air quality is not compromised inside buildings. According to the Environmental Protection Agency, the dangerous pollutants found indoors can exceed outdoor air pollutant levels.

Duct cleaning from Air Duct Cleaning Edmond professionals removes both visible contaminants and microbial contaminants that may not be visible to the naked eye. Both can affect indoor air quality and present a health hazard. Air ducts can play host to a number of hazardous microbial agents. Legionnaires’ disease is one illness that has gained public notice; our modern surroundings support the growth of the bacteria that cause it and can lead to outbreaks. Typical disease-promoting environments include moisture-producing equipment, such as poorly maintained cooling towers in air-conditioned buildings. In summary, in designing and building systems to control our surroundings, we have created perfect conditions for this disease. Those systems must be correctly monitored and maintained. That is the secret to controlling the disorder.

Robots allow the job to be done faster while saving workers from exposure. Evidence of the technological progress in the duct cleaning business is apparent in the variety of equipment now available, for example, the array of robotic gear used in air duct cleaning. Robots are priceless in hard-to-reach places. Robots once used only to inspect conditions inside the duct may now be used for spraying, cleaning, and sampling procedures. The remote-controlled robotic gear can be fitted with tool and fastener attachments to serve many different functions.

Video recorders and a closed-circuit television camera system can be attached to the robotic gear to view conditions and operations and for documentation purposes. Inspection devices on the robot examine the inside of the ducts. Robots can travel to particular sections of the system and move around barriers. Some combine functions that enable cleaning operations, follow operator instructions, and fit into small ducts. They can deliver a useful viewing range, with some models delivering disinfection, cleaning, inspection, coating, and sealing capabilities economically.

The remote-controlled robotic gear comes in various sizes and shapes for different uses. The first use of robotic video cameras was in the 1980s, to record conditions inside ducts. Robotic cleaning systems now have many more uses. These devices provide improved access for better cleaning and reduce labor costs. Lately, the service industries have expanded the range of uses for small mobile robots, including inspection and duct cleaning.

More improvements are being considered to make an already productive tool even more effective. If you decide to have your heating, ventilation, and cooling system cleaned, it is important to make sure the contractor is qualified to clean all parts of the system and actually does so. Failure to clean one part of a contaminated system can lead to re-contamination of the entire system.

When To Call A DWI Attorney

Charges or fees against a DWI offender require a qualified Sugar Land criminal defense attorney in order to reduce or dismiss them. So, undoubtedly, everyone in that position needs a DWI attorney. Even for a first-time violation the penalties can be severe, so being represented by a qualified DWI attorney is vitally important. If you are facing repeat DWI charges, the punishments can be severe and can include felony charges. Finding an excellent attorney is thus a job you should approach as soon as possible.

Every state in America makes its own laws and legislation regarding DWI violations, so bear in mind that you should hire a DWI attorney who practices in the state where the violation occurred. They will have the knowledge and expertise of the relevant state law to defend you adequately, and they will be familiar with the procedures and tests used to establish your guilt.

As your attorney, they will look at the tests that were administered at the time of your arrest and the accompanying police evidence to assess whether those tests were performed accurately, carried out by competent staff, and whether the right procedures were followed. Police testimony can also be challenged in court, although it is not often that police testimony is contested.

When you start looking for a DWI attorney, you should try to find someone who specializes in this kind of case. While many attorneys may be willing to take on your case, a lawyer who specializes in these cases has the expert knowledge needed to interpret the scientific and medical tests run when you were detained. The first consultation is typically free and gives you the chance to ask about their experience with these cases and their fees.

Many attorneys work according to an hourly fee or on a set-fee basis determined by the kind of case. You can find out how they are paid, match that to your financial situation, and you may be able to negotiate the terms of their fee. If you cannot afford to hire a private attorney, you can request a court-appointed attorney paid for by the state. Before you hire a DWI attorney, make sure you understand the precise charges against you and when you are expected to appear in court.

How a Credit Card Works

The credit card makes your life easier, supplying an amazing set of options. The credit card is a retail transaction settlement and credit system, worked through the little plastic card that bears its name. The physical card itself always takes the same structure, size, and shape, as regulated by ISO 7810, the standard that defines credit cards. A strip of special material on the card (the substance resembles that of a floppy disk or a magnetic tape) stores all the necessary data. This magnetic strip enables the credit card’s validation. Design has also become an important variable; an enticing credit card layout must still ensure that the card keeps its information reliably.

A credit card is supplied to the user only after a bank approves an account, estimating a varied set of variables to ascertain financial reliability. This bank is the credit provider. When an individual makes a purchase, he must sign a receipt to verify the transaction. The receipt carries the card details and the amount of money to be paid. Many shops accept electronic authorization for credit cards and use cloud tokenization for authorization. Nearly all verifications are made using a digital verification system, which makes it possible to confirm that the card is valid. A retailer may also check whether the customer has enough credit to cover the purchase he is attempting to make while staying within his credit limit.
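
One concrete piece of that validation is the checksum built into the card number itself. The sketch below shows the standard Luhn check that catches most mistyped card numbers; it is a minimal illustration, not any issuer’s actual authorization logic:

```python
def luhn_valid(card_number: str) -> bool:
    """Return True if the card number passes the Luhn checksum."""
    digits = [int(d) for d in card_number if d.isdigit()]
    checksum = 0
    # Double every second digit from the right; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

print(luhn_valid("4539 1488 0343 6467"))  # True: a standard test number
print(luhn_valid("4539 1488 0343 6468"))  # False: last digit altered
```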

As the credit provider, it is up to the bank to keep the user informed of his statement. Banks typically send monthly statements detailing each transaction processed through the card, the outstanding fees, and the amounts owed. This enables the cardholder to ensure all the payments are right, and to discover mistakes or fraudulent activity to dispute. Interest is typically charged at the end of the following billing cycle, and the bank establishes a minimum repayment amount.

The precise way the interest is charged is normally set in an initial agreement, and the provider spells out these elements on the back of the credit card statement. Generally, the credit card is a simple form of revolving credit from one month to the next. It can also be a sophisticated financial instrument with many balance segments, affording a greater degree of credit management. Interest rates can also differ from one card to another. Credit card promotion services use appealing incentives to keep their customers and find some new ones along the way.
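
To make the revolving-credit mechanics concrete, here is a minimal sketch of one month of interest and a minimum payment; the 18 percent APR and 3 percent minimum-payment rule are assumptions for illustration, since real terms are set in the cardholder agreement:

```python
def month_end(balance: float,
              annual_rate: float = 0.18,       # assumed 18% APR
              min_payment_rate: float = 0.03): # assumed 3% minimum
    """Apply one month of interest, then compute the minimum payment."""
    new_balance = balance * (1 + annual_rate / 12)
    return new_balance, new_balance * min_payment_rate

balance, minimum = month_end(1000.00)
print(f"new balance: ${balance:.2f}, minimum payment: ${minimum:.2f}")
# new balance: $1015.00, minimum payment: $30.45
```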

Why Get Help From A Property Management Company?

As many landlords understand, leasing out your piece of real property can be a real cash cow, but that cash flow usually comes with tremendous worry. Late-night phone calls from tenants, the trouble of marketing the house when you have a vacancy, overdue lease payments that you must chase down, and overflowing lavatories all take a lot of the pleasure out of earning money off of leases. One solution that keeps the revenue of your rental home while removing much of the anxiety is to contact and engage a property management company in Oklahoma City, Oklahoma. If you are considering this option and wish to know more, please read the remainder of the post.

These businesses act as the go-between for you and the tenant. When you hire a property management company, the tenant never actually needs to know who you are. The company manages the day-to-day relationship with the tenant while you still have the ability to make the final judgments regarding the home. If you have a unit that is vacant, the company can manage the marketing for you. Since the company has more connections in the industry and reaches a bigger market than you do, you’ll discover your unit gets filled a whole lot more quickly with their aid. In addition, the property management company will take care of screening prospective tenants. Depending on the arrangement you have, you may still get the last say on whether a tenant is qualified for the unit, but the day-to-day difficulty of locating a suitable tenant is no longer your problem. They’ll also manage the move-in inspections as well as the inspections required after a tenant moves out.

After the unit is filled, you can step back and watch the profits. If there is an issue, the company will handle communication with the tenant. You won’t be telephoned if a pipe bursts in the middle of the night. The tenant calls your representative at the company, who then makes the arrangements required to get the issue repaired by a maintenance provider. You may get a phone call a day later, or you may not know there was an issue until you check in with the business. The property management organization will also collect your rental payments. If a tenant is late making a payment, the company will do what’s required to collect. In certain arrangements, the organization will also take over paying taxes, insurance, and the mortgage on the piece of property. You need do nothing but enjoy the revenue that is sent your way after all the invoices are paid.

With all these advantages, you’re probably wondering what the downside of employing a property management organization must be. The primary factor that stops some landlords from hiring one is the price: you will be paying for all of these services. You must weigh the price against the time you’ll save, time that you may then use to pursue additional revenue-producing efforts or simply to enjoy the fruits of your investment.

Benefits From Orthodontic Care

Orthodontics is the specialty of dentistry centered on the diagnosis and treatment of dental and related facial problems. The outcomes of Norman Orthodontist OKC treatment can be dramatic: lovely smiles, improved oral health, better aesthetics, and increased facial harmony for many individuals of all ages. Whether cosmetic dental attention is needed or not is an individual’s own choice. Most folks tolerate conditions like various kinds of bite issues or overbites and don’t get treated. Nevertheless, a number of us feel more assured with teeth that are correctly aligned, appealing, and simpler to care for. Dental attention can enhance both structure and appearance. It might also help you speak with clarity or chew better.

Orthodontic care isn’t only cosmetic in character. It can also benefit long-term oral health. Straight, correctly aligned teeth are easier to floss and clean. This can decrease the risk of decay. It may also stop gingivitis, the irritation that troubles gums. Gingivitis can develop into periodontitis, which occurs when microorganisms gather around the place where the teeth and the gums meet. Left untreated, such a condition can result in tooth loss and may ruin the bone that surrounds the teeth. People who have harmful bites chew with less efficiency. A few of us with a serious bite problem might have difficulty obtaining enough nutrients; this can happen when the teeth aren’t aligned correctly. Repairing bite issues can make it easier to chew and digest meals.

One may also have speech problems when the top and bottom front teeth don’t align right. These can be fixed through therapy, occasionally combined with surgical help. Finally, treatment can help prevent early wear of the back teeth. As you bite down, your teeth endure a remarkable amount of pressure. If your top teeth don’t fit well, it will cause your back teeth to wear down. The most frequently encountered forms of treatment are braces (or retainers) and headgear. But many people complain about discomfort with these techniques, which, unfortunately, is unavoidable. Braces can cause sores, and some individuals have problems talking. Dentists, though, say the pain normally disappears within several days. Occasionally the appliances cause annoyance. If you’d like to avoid more unpleasant sensations, you should avoid hard, crunchy, and sticky food. In addition, do not take your braces off unless the medical professional says so.

It is advised that you see your medical professional often for examinations to prevent possible problems that may appear while getting therapy. You will be prescribed a specific dental hygiene routine if necessary. Dental specialists today look out for the identification and management of malocclusion. Orthodontia, the relevant specialization of medicine, mainly targets repairing jaw problems and teeth, and thus your smile as well as your bite. Orthodontists, however, won’t only do jaw remedies and emergency teeth work. They also handle mild to severe dental conditions that could grow into risky states. You don’t have to measure your whole life against a single predicament. See a dental specialist, and you’ll notice plenty that is stunning about your smile.

MIT engineers develop “blackest black” material to date

With apologies to “Spinal Tap,” it appears that black can, indeed, get more black.

MIT engineers report today that they have cooked up a material that is 10 times blacker than anything that has previously been reported. The material is made from vertically aligned carbon nanotubes, or CNTs — microscopic filaments of carbon, like a fuzzy forest of tiny trees, that the team grew on a surface of chlorine-etched aluminum foil. The foil captures more than 99.96 percent of any incoming light, making it the blackest material on record.

The researchers have published their findings today in the journal ACS Applied Materials and Interfaces. They are also showcasing the cloak-like material as part of a new exhibit today at the New York Stock Exchange, titled “The Redemption of Vanity.”

The artwork, a collaboration between Brian Wardle, professor of aeronautics and astronautics at MIT, and his group, and MIT artist-in-residence Diemut Strebe, features a 16.78-carat natural yellow diamond, estimated to be worth $2 million, which the team coated with the new, ultrablack CNT material. The effect is arresting: The gem, normally brilliantly faceted, appears as a flat, black void.

Wardle says the CNT material, aside from making an artistic statement, may also be of practical use, for instance in optical blinders that reduce unwanted glare, to help space telescopes spot orbiting exoplanets.

“There are optical and space science applications for very black materials, and of course, artists have been interested in black, going back well before the Renaissance,” Wardle says. “Our material is 10 times blacker than anything that’s ever been reported, but I think the blackest black is a constantly moving target. Someone will find a blacker material, and eventually we’ll understand all the underlying mechanisms, and will be able to properly engineer the ultimate black.”

Wardle’s co-author on the paper is former MIT postdoc Kehang Cui, now a professor at Shanghai Jiao Tong University.

Into the void

Wardle and Cui didn’t intend to engineer an ultrablack material. Instead, they were experimenting with ways to grow carbon nanotubes on electrically conducting materials such as aluminum, to boost their electrical and thermal properties.

But in attempting to grow CNTs on aluminum, Cui ran up against a barrier, literally: an ever-present layer of oxide that coats aluminum when it is exposed to air. This oxide layer acts as an insulator, blocking rather than conducting electricity and heat. As he cast about for ways to remove aluminum’s oxide layer, Cui found a solution in salt, or sodium chloride.

At the time, Wardle’s group was using salt and other pantry products, such as baking soda and detergent, to grow carbon nanotubes. In their tests with salt, Cui noticed that chloride ions were eating away at aluminum’s surface and dissolving its oxide layer.

“This etching process is common for many metals,” Cui says. “For instance, ships suffer from corrosion of chlorine-based ocean water. Now we’re using this process to our advantage.”

Cui found that if he soaked aluminum foil in saltwater, he could remove the oxide layer. He then transferred the foil to an oxygen-free environment to prevent reoxidation, and finally, placed the etched aluminum in an oven, where the group carried out techniques to grow carbon nanotubes via a process called chemical vapor deposition.

By removing the oxide layer, the researchers were able to grow carbon nanotubes on aluminum, at much lower temperatures than they otherwise would, by about 100 degrees Celsius. They also saw that the combination of CNTs on aluminum significantly enhanced the material’s thermal and electrical properties — a finding that they expected.

What surprised them was the material’s color.

“I remember noticing how black it was before growing carbon nanotubes on it, and then after growth, it looked even darker,” Cui recalls. “So I thought I should measure the optical reflectance of the sample.”

“Our group does not usually focus on optical properties of materials, but this work was going on at the same time as our art-science collaborations with Diemut, so art influenced science in this case,” says Wardle.

Wardle and Cui, who have applied for a patent on the technology, are making the new CNT process freely available to any artist to use for a noncommercial art project.

“Built to take abuse”

Cui measured the amount of light reflected by the material, not just from directly overhead, but also from every other possible angle. The results showed that the material absorbed greater than 99.995 percent of incoming light, from every angle. In essence, if the material contained bumps or ridges, or features of any kind, no matter what angle it was viewed from, these features would be invisible, obscured in a void of black.  

The researchers aren’t entirely sure of the mechanism contributing to the material’s opacity, but they suspect that it may have something to do with the combination of etched aluminum, which is somewhat blackened, with the carbon nanotubes. Scientists believe that forests of carbon nanotubes can trap and convert most incoming light to heat, reflecting very little of it back out as light, thereby giving CNTs a particularly black shade.

“CNT forests of different varieties are known to be extremely black, but there is a lack of mechanistic understanding as to why this material is the blackest. That needs further study,” Wardle says.

The material is already gaining interest in the aerospace community. Astrophysicist and Nobel laureate John Mather, who was not involved in the research, is exploring the possibility of using Wardle’s material as the basis for a star shade — a massive black shade that would shield a space telescope from stray light.

“Optical instruments like cameras and telescopes have to get rid of unwanted glare, so you can see what you want to see,” Mather says. “Would you like to see an Earth orbiting another star? We need something very black. … And this black has to be tough to withstand a rocket launch. Old versions were fragile forests of fur, but these are more like pot scrubbers — built to take abuse.”

Startup uses virtual reality to help seniors re-engage with the world

Reed Hayes MBA ’17 wasn’t quite sure what to expect. He stood inside an assisted living facility in front of an elderly man struggling with dementia. The man sat slouched in his wheelchair, unmoving, his eyes barely open. Hayes had enrolled in MIT’s Sloan School of Management with the idea of helping older adults overcome depression and isolation through the immersive world of virtual reality. Now he needed to test his idea.

Hayes turned on a virtual reality experience featuring a three-dimensional painting by Vincent Van Gogh and a classical piano playing in the background. Nervously, he placed the headset on the man. What happened next stunned everyone in the room.

“He just came alive,” Hayes remembers. “He started moving around, tapping his feet, laughing. He was all of a sudden much more engaged in the world, and this from someone who was slouched over, to now kind of bouncing around. [My classmate] Dennis and I looked at each other like, ‘Holy cow, we might be onto something.’ It was remarkable.”

It would not be the last time Hayes and Dennis Lally MBA ’17 saw the transformative impact of virtual reality (VR). Their startup, Rendever, which they founded with Kyle Rand and Thomas Neumann, has since brought its VR experiences to more than 100 senior living communities, and has launched in hospitals to extend the enthralling world of VR to patients of all ages.

“Starting Rendever was one of the most important things I’ve done in my life,” Hayes says. “It holds a special place in my heart, and it’s probably the most material impact I’ll have in my life.”

Rendever’s main product is its resident engagement platform, which offers users a variety of games and activities like virtual scuba diving and hiking, and includes content from diverse sources that let users travel almost anywhere in the world. One of the most important features of the platform, though, is its ability to sync to multiple headsets at once, prompting social group activities.

“It’s amazing to see them point things out to each other and engage with one another, yelling ‘Look left!’ Or ‘There’s a puppy at our feet!’” says Grace Andruszkiewicz, Rendever’s director of marketing and partnerships. “Or, if they’re in Paris, someone might say, ‘I was in Paris in 1955 and there was this cute café,’ and people start adding details and telling their own stories. That’s where the magic happens.”

The company, which uses off-the-shelf headsets, also offers a family engagement portal so relatives can upload personal content like photos or videos that let users relive fond memories or be present in places they can’t physically be in. For example, family members can borrow a 360-degree camera, or purchase their own, to take to weddings or on family vacations.

The idea for the company was first sketched out by Hayes on a napkin at MIT’s Muddy Charles Pub as part of a pitch to Lally shortly after they’d come to MIT. The co-founders brought on Rand and Neumann during the delta v summer accelerator, which is run out of the Martin Trust Center for Entrepreneurship. They officially launched the company in the fall of 2016.

Since then, everyone at the company has racked up a series of unforgettable memories watching older adults use the platform. Lally remembers one early test when they gave an older woman the experience of seeing the Notre-Dame cathedral in France.

“She was so ecstatic to be able to see this church from the inside, something she had dreamt about, and we were able to kind of fulfill a lifelong dream of hers,” Lally says. Indeed, the company says it specializes in helping seniors cross items off their bucket list.

Rendever’s team adds original content to its platform twice a month, much of it based on feedback from residents at the communities that subscribe to the service. Subscriptions include headsets, a control tablet, a large content library, training, support, and warranties.

The company also helps nursing homes deliver personalized content to their residents, which makes for some of the most powerful experiences.

“Once there was an older adult who just kept saying ‘I want to go home,’ but she was in an assisted living community because she was showing signs of dementia,” Hayes remembers. “With the technology that we’d built, we were able to type in the address of her home and take her there. And she started crying tears of joy. She kept saying, ‘This is the most beautiful place in the world.’”

Now the company is working to reproduce in clinical trials the results they’ve seen with individual clients.

A study performed in conjunction with the MIT AgeLab and presented at the 2018 International Conference on Human Aspects of IT for the Aged Population compared social VR experiences for older adults with watching the same scenes on a television. The researchers found that the people who had shared these experiences through VR were significantly less likely to report depression or social isolation and more likely to feel better about their overall well-being.

“To this day, the power of the shared experience remains at the heart of our philosophy, and we owe much of that to our roots at MIT and ongoing collaboration with the MIT AgeLab,” says Rendever CEO Kyle Rand.

Rendever is also deploying its system outside of senior living communities. A study with UCHealth in Colorado used Rendever’s VR as a distraction for patients undergoing unpleasant treatments such as chemotherapy. After the program, 88 percent of participants said they’d use VR again.

The system has worked so well that many of Rendever’s employees have used it with their own aging relatives. Before Andruszkiewicz accepted a job at the company, she asked if she could take a demo set to her 89-year-old grandmother.

“She started telling me stories that I’d never heard before, and she and I have a really close relationship, so it was surprising that some of her memories had come back,” Andruszkiewicz says. “That sealed the deal for me.”

Factors such as quality of life and mental stimulation have long been suspected to influence impairments related to aging. Rendever’s team is hoping the transformations they’ve seen can be replicated through peer-reviewed research. One particular transformation sticks with everyone.

For years, an elderly woman named Mickey was the most outgoing and friendly person in her Connecticut assisted living community. She knew everyone’s name, was a regular at community events, and always had a smile on her face.

Then she was diagnosed with dementia. One of her first symptoms was expressive aphasia, a disorder that robbed her of her ability to speak. Mickey’s silence left a void in the community and saddened residents and staff members.

Then Rendever’s team came in to do training. A staff member, with tears in his eyes, told the team about Mickey, so they cued up a scene of golden retriever puppies and put the headset on her.

“She completely lights up,” Andruszkiewicz recalls. “Mickey was trying to pet the puppies, and calling them over, and she was talking throughout the experience.”

From a clinical perspective, it’s too early to say that VR improves symptoms related to aging, but when Rendever followed up with the Connecticut community six months later, they learned something interesting: Mickey had continued using Rendever, and continued communicating with old friends who never thought they’d hear from her again.

Letter regarding update on the MIT Schwarzman College of Computing

The following letter was sent to MIT faculty members on Sept. 11 by Provost Martin A. Schmidt.

Dear Colleagues,

I am writing to share with you a status update from Dean Dan Huttenlocher on the formation of the MIT Stephen A. Schwarzman College of Computing. This is the first of what I expect to be regular updates over the next few months as the initial set of academic units and structure of the college are being determined. As I have mentioned in the past, the college stands on three legs: advancing computer science, building strong connections between computer science and all other disciplines at MIT, and being intentional in contemplating the societal implications of computing. Dean Huttenlocher is working to advance each of these legs as we build the college, and his note below provides an update on where we stand. Out of necessity, the plans for each leg are moving at different paces, but it is important for me to stress that each one of these legs is vital to the college’s overall success.  

Sincerely,

Martin A. Schmidt

MIT Schwarzman College of Computing

September 11, 2019 Update

Dan Huttenlocher, Dean

I would first like to thank the MIT community—particularly those who have given so generously of their time and ideas—for their continued support in the creation and establishment of the MIT Schwarzman College of Computing (SCC). While much remains to be done, I am excited to share this update on progress made over the summer, including:

  1. Establishment of an advisory group, building on the College of Computing Task Force.
  2. Initial administrative leadership appointments and planned additional appointments over the coming year.
  3. Process for creating an organizational plan for the college.
  4. Organizational plan for the Department of Electrical Engineering and Computer Science (EECS).
  5. Summary of the college’s mission and scope.

1. Advisory Group. I am delighted that members of last year’s College of Computing Task Force Steering Committee have agreed to serve this year as an advisory group for the formation of the SCC, as they have a unique understanding of the opportunities and challenges from their deep involvement in the working groups. The advisory group members are Eran Ben-Joseph, Anantha Chandrakasan, Rick Danheiser, Srinivas Devadas, Benoit Forget, William Freeman, Melissa Nobles, Asu Ozdaglar, Nelson Repenning, Nicholas Roy, Julie Shah, and Troy Van Voorhis. I have also recently met with undergraduate and graduate student groups and will be working with them on avenues for student input, which are likely to largely be through academic units, as those are the main loci of educational programs.

2. Leadership Appointments. I am also pleased to announce initial leadership appointments for the SCC. These take effect immediately and will report directly to me. Daniela Rus will serve as deputy dean of research and continue as director of CSAIL; Asu Ozdaglar will serve as deputy dean of academics and continue as head of EECS; David Kaiser and Julie Shah will serve as co-heads of social and ethical responsibilities of computing; Eileen Ng will serve as assistant dean of administration, moving from the School of Engineering; and Terri Park will serve as director of communications, moving from the Innovation Initiative. Please join me in congratulating them and thanking them for taking on these important new roles.

Over the coming year, we expect to fill other SCC leadership positions, including an assistant or associate dean of equity and inclusion, assistant dean for development, director of a new Center for Advanced Study of Computing, and leadership of a new teaching collaborative for both disciplinary and interdisciplinary computing classes, once the structure of the college and these additional roles becomes more fully defined. We further expect that the Institute for Data, Systems, and Society (IDSS), the Center for Computational Engineering (CCE), the Computer Science and Artificial Intelligence Laboratory (CSAIL), the Laboratory for Information and Decision Systems (LIDS), the Quest for Intelligence, and perhaps other units will become part of the SCC as planning proceeds, and that their directors will be asked to remain in their roles.

3. College Planning Process. Based on last spring’s College of Computing Task Force Working Group reports, a strawman organizational plan for the college is being developed through an iterative process. The school deans and the aforementioned advisory group are reviewing an early draft. A revised strawman will then be circulated to the school councils, and then to the full MIT faculty for feedback and final revision. We expect to begin sharing the strawman more broadly in the coming months through forums to solicit feedback from the MIT community.

We do not foresee much in the way of curriculum or subject changes this school year, as these are defined by the faculty of academic units through deliberative processes, although there will likely be some pilots.

4. EECS Plan. An organizational plan for EECS, developed over the summer based on the task force working group reports, is now being implemented. This plan also was developed iteratively, first involving the co-chairs of the Organizational Structure and Faculty Appointments working groups, followed by the Engineering Council and the EECS faculty.

The plan calls for the department to form three overlapping academic units, as suggested by the Organizational Structure Working Group. These units are termed faculties: a faculty of electrical engineering (EE), a faculty of computer science (CS), and a faculty of artificial intelligence and decision making (AI+D). Each faculty will have a head, and will be overseen by the head of EECS. The faculties will be the main locus of faculty recruiting, mentoring, and promotion for the department. The department will report jointly to the School of Engineering and the Schwarzman College of Computing, and will remain responsible for Course 6. A search committee for the heads of the three faculties has just been formed. No specific curricular changes are foreseen at this time, as those will take time and will be led by the faculties and the department.

5. Mission and Scope. The planning efforts for the Schwarzman College of Computing are driven by the following mission and scope of activities.

Widespread advances in computing—from hardware to software to algorithms to artificial intelligence—have improved people’s lives in myriad ways, with numerous promising opportunities on the horizon. Yet at the same time, we face critical and growing challenges regarding the social and ethical implications and responsibilities of computing, particularly with the increasing applicability of artificial intelligence. Moreover, despite the unprecedented growth of computer science, artificial intelligence, and related academic program areas, substantial unmet demand for expertise in computing remains, as does the constant need to keep up with rapidly changing academic content.

The mission of the MIT Stephen A. Schwarzman College of Computing (SCC) is to address the challenges and opportunities of the computing age by transforming the capabilities of academia in three key areas:

  1. Computing fields: Support the rapid growth and evolution of computer science and computational areas of allied fields such as electrical engineering, as reflected notably in the rise of artificial intelligence;
  2. Computing across disciplines: Facilitate productive research and teaching collaborations between computing and other fields, rather than place one field in service of another;
  3. Social and ethical aspects of computing: Lead the development of and changes in academic research and education, and effectively inform practice and policy in industry and government.

These three areas are integral to the SCC mission. Moreover, they are not independent but rather should inform and amplify one another.

Based on the College of Computing Task Force Working Group reports and follow-on planning over the summer, the SCC is expected to:

  • Have academic units that are more flexible and interconnected than conventional departments and schools while simultaneously reinforcing MIT’s strength in computing fields such as CS and large parts of EE, which generally have traditional departments at other top institutions. The SCC is expected to have multiple types of academic structures to meet this variety of needs.
  • Involve faculty from a broad range of departments and schools who are engaged in computing education and research through a variety of programs and affiliations including: (i) the Common Ground, a teaching collaborative for both disciplinary and interdisciplinary computing classes, (ii) centers that offer graduate programs in computing, (iii) computing research labs and centers, and (iv) additional scholarly activities in computing, including workshops and other convenings.
  • Lead in the social and ethical aspects of computing with: (i) education that helps develop “habits of mind and action” for those who create and deploy computing technologies, (ii) research that brings technological, social science, and humanistic approaches together, and (iii) impact on government and corporate policy as well as the development of a better understanding among the general public.
  • Lead in the rapid evolution of computing fields, both in research and education, currently exemplified by the rise of areas such as AI and machine learning (ML).
  • Deliver outstanding undergraduate and graduate education in: (i) CS and EE, (ii) evolving areas of computing such as AI/ML and others, (iii) computing across the disciplines, and (iv) social and ethical aspects of computing.
  • Improve equity and inclusion in computing at MIT with the aim of helping address diversity in computing with regard to gender, race, and range of backgrounds and experience. Focus on increasing the diversity of top faculty candidates, with new programs for faculty, postdocs, and PhD students, as well as improvements to existing programs. Broaden participation in computing classes and academic programs at all levels, including improvements to the climate and the development of more effective connections with other, more diverse, disciplines.

The SCC will include both existing and new academic units and programs. Bringing existing activities together will help facilitate coordination and alignment of computing education and research as well as provide new opportunities for improvement, such as bringing more coordination to the academic programs that mix computing and other disciplines and to the teaching of social and ethical issues in computing. Creating new programs and units will help address areas that are not well covered by existing ones and also help in fostering new connections.

When rats work to protect human safety

During a trip to Brussels in 2013, Jia Hui Lee decided to visit the Royal Museum of the Armed Forces and Military History. While there, he stumbled upon a poster depicting a rat on the ground next to a partially visible land mine. It was April 4, International Mine Awareness Day, and the poster was part of a display about the use of rodents to detect land mines.

“When you think about war, you think about these big technological tools, vehicles, and systems. Then to see this image of a rat, it was quite jarring and piqued my interest immediately,” says Lee, a fifth-year doctoral student in MIT’s History, Anthropology, and Science, Technology, and Society (HASTS) program.

He had been thinking about humanity’s relationship with other animals and the environment during the era of climate change, and the display provided the kernel of his PhD thesis, which looks at human-rodent interactions in Tanzania, where humans are training rats to detect landmines, as well as tuberculosis.

As a queer man of color, Lee has frequently questioned ideas about power, privilege, and people’s places in society, including his own. With his graduate work, he is extending these questions to consider cross-species interactions and what they say about the impact of technology on society and politics. Throughout his studies, the ethical considerations of anthropology, including who gets to speak for the experiences of others and what experiences are studied in the first place, have remained central to Lee’s work.

Helpers, friends, vermin, enemies

For his thesis, Lee completed 15 months of field research in Tanzania examining how trainers interacted with, talked about, and ultimately conditioned rats in order to get them to find land mines. He later spent two months in Cambodia to see how the animals worked in the field. The Tanzanian-trained rats were deployed in an area to clear possible land mines, and after they determined that there were no active mines in that site, Lee took a walk through the area. He jokes that the fact that he’s still alive to talk about the experience demonstrates the success of the training.

Lee is very careful about how he talks about the nonhuman animals in his research, to acknowledge the cross-cultural differences in how humans think about them. For instance, many people in Tanzania consider rats to be intelligent and helpful, whereas in New York City, for example, they are more commonly viewed as vermin. Likewise, Lee notes that in the U.S. and European countries, dogs are generally viewed as humans’ best friends and treated as part of the family. In places like Tanzania and Kenya, however, he says dogs are often viewed as vicious and fierce, because of the historic use of dogs by colonial British police officers to violently control anticolonial protesters, and later as guards against theft.

“The knowledge I hope to produce out of this research is in conversation with zoology, biology, and cognitive science. It includes histories of human-animal interactions which are usually left out in other kinds of disciplines,” Lee says.

His focus on East Africa grew in part out of previous research on the growing science and technology markets in the region. Although the technology scene in East Africa is flourishing, he notes, this growth doesn’t get the same recognition as tech hubs in the West.

“You see a really exciting embrace of science and technology in this region. It’s interesting to think about these types of science and technology projects in East Africa — not Cambridge, Massachusetts, or London. It’s really important to think of East Africa as a location of critical thinking and knowledge production,” he says.

Equity on campus

As a person who is concerned with power and privilege, it is no surprise that Lee has advocated on behalf of the Institute’s graduate community. As a graduate fellow for the Institute Community and Equity Office, Lee worked with Professor Ed Bertschinger and other fellows to find ways to candidly discuss the state of diversity and inclusion at MIT.

“Over the course of a semester, we hosted discussion lunches that included students, staff, and faculty to share best practices in different departments that foster inclusion at the Institute,” Lee says.

He also served on the Working Group on Graduate Student Tuition Models to gather data about grad students’ experiences with some of the Institute’s funding structures. He compiled the stories of various members of the graduate community to present to the Institute’s administration in order to demonstrate the ways that students’ well-being could be enhanced. MIT’s senior leadership has now begun seeking ways to alleviate financial insecurity faced by some of the Institute’s graduate students and has also launched a new effort to better support those with families.

Citizen of the world

Lee has wide-ranging interests in history and culture, and one of his favorite things to do in his free time is to walk around and analyze Boston’s architecture. After living in the area on and off for about a decade, he says he really enjoys getting to know Boston and its physical changes intimately. He thinks it’s fascinating to think about the city’s transformation from a part of the sea hundreds of years ago to the urban hub it is now. Throughout his travels the past few years, he has picked up bits of art and architectural history that have informed his understanding of some of Boston’s iconic landmarks.

“In Boston, there’s a lot of Italian influences on certain architecture, so the Isabella Stewart Gardner Museum looks like an Italian Renaissance palazzo, which is so quirky. But then Back Bay, especially Commonwealth Avenue, was designed to resemble a French boulevard,” Lee explains.

Beyond Boston and Tanzania, Lee has been all over the world, and picked up various languages in the process. He speaks Malay, Swahili, French, English, Hindi, and Urdu, and a bit of several Chinese dialects. In his adventures, Lee has also recognized the value of being alone, and he advocates for solo travel. It invites unique experiences, he says, which for him has included being brought to dance clubs and even a Tanzanian wedding.

“I feel like the likelihood of randomly meeting a person or stumbling into an event or a festival is so much higher than if you’re traveling with somebody. And when you’re alone, I think people do draw you into whatever events they are going to,” Lee says.

Taking the next giant leaps

In July, the world celebrated the 50th anniversary of the historic Apollo 11 moon landing. MIT played an enormous role in that accomplishment, helping to usher in a new age of space exploration. Now MIT faculty, staff, and students are working toward the next great advances — ones that could propel humans back to the moon, and to parts still unknown.  

“I am hard-pressed to think of another event that brought the world together in such a collective way as the Apollo moon landing,” says Daniel Hastings, the Cecil and Ida Green Education Professor and head of the Department of Aeronautics and Astronautics (AeroAstro). “Since the spring, we have been celebrating the role MIT played in getting us there and reflecting on how far technology has come in the past five decades.” 

“Our community continues to build on the incredible legacy of Apollo,” Hastings adds. Some aspects of the future of space exploration, he notes, will follow from lessons learned. Others will come from newly developed technologies that were unimaginable in the 1960s. And still others will arise from novel collaborations that will fuel the next phases of research and discovery. 

“This is a tremendously exciting time to think about the future of space exploration,” Hastings says. “And MIT is leading the way.”

Sticking the landing

Making a safe landing — anywhere — can be a life-or-death situation. On Earth, thanks to a network of global positioning satellites and a range of ground-based systems, pilots have instantaneous access to real-time data on every aspect of a landing environment. The moon, however, is not home to any of this precision navigation technology, making it rife with potential danger. 

NASA’s recent decision to return to the moon has made this a more pressing challenge — and one that MIT has risen to before. The former MIT Instrumentation Lab (now the independent Draper) developed the guidance systems that enabled Neil Armstrong and Buzz Aldrin to land safely on the moon, and that were used on all Apollo spacecraft. This system relied on inertial navigation, which uses a digital computer to integrate acceleration and velocity measurements from electronic sensors on the vehicle and so determine the spacecraft’s location. It was a remarkable achievement — the first time that humans traveled in a vehicle controlled by a computer.  
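To make the mechanics concrete, here is a minimal sketch of inertial dead reckoning, reduced to one dimension with a fixed sample interval. The function and the numbers are illustrative only, not drawn from the Apollo guidance computer:

```python
def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Integrate accelerometer samples twice: acceleration to velocity,
    then velocity to position. Any sensor bias gets integrated too,
    which is why real systems periodically correct with external fixes."""
    v, x = v0, x0
    for a in accels:
        v += a * dt   # first integration: acceleration -> velocity
        x += v * dt   # second integration: velocity -> position
    return v, x

# A 2 m/s^2 burn for 5 seconds, then 5 seconds of coasting, sampled at 1 Hz:
print(dead_reckon([2.0] * 5 + [0.0] * 5, dt=1.0))  # (10.0, 80.0)
```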

Today, working in MIT’s Aerospace Controls Lab with Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics, graduate student Lena Downes — who is also co-advised by Ted Steiner at Draper — is developing a camera-based navigation system that can sense the terrain beneath the landing vehicle and use that information to update the location estimation. “If we want to explore a crater to determine its age or origin,” Downes explains, “we will need to avoid landing on the more highly sloped rim of the crater. Since lunar landings can have errors as high as several kilometers, we can’t plan to land too close to the edge.” 

Downes’s research on crater detection involves processing images using convolutional neural networks and traditional computer vision methods. The images are combined with other data, such as previous measurements and known crater location information, enabling more precise estimation of the vehicle’s location.
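That combination step can be pictured as a standard measurement update: weight the camera fix against the onboard estimate by their relative uncertainties. The scalar Kalman-style sketch below is a simplified illustration under Gaussian assumptions; the names and numbers are hypothetical, not taken from the actual system:

```python
def fuse(position_est, var_est, camera_meas, var_meas):
    """Blend a prior position estimate with a terrain-relative camera fix.
    The gain weights the measurement by relative confidence (inverse
    variance), so a precise fix pulls the estimate strongly toward it."""
    gain = var_est / (var_est + var_meas)
    fused = position_est + gain * (camera_meas - position_est)
    fused_var = (1.0 - gain) * var_est
    return fused, fused_var

# An inertial estimate uncertain by ~2 km versus a crater fix good to ~100 m:
print(fuse(position_est=0.0, var_est=2.0**2, camera_meas=1.2, var_meas=0.1**2))
```

The fused estimate lands almost on top of the camera measurement, which is exactly why a good terrain fix can shrink kilometer-scale landing errors.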

“When we return to the moon, we want to visit more interesting locations, but the problem is that more interesting can often mean more hazardous,” says Downes. “Terrain-relative navigation will allow us to explore these locations more safely.”

“Make it, don’t take it”

NASA also has its sights set on Mars — and with that objective comes a very different challenge: What if something breaks? Given that the estimated travel time to Mars is between 150 and 300 days, there is a relatively high chance that something will break or malfunction during flight. (Just ask Jim Lovell or Fred Haise, whose spacecraft needed serious repairs only 55 hours and 54 minutes into the Apollo 13 mission.)

Matthew Moraguez, a graduate student in Professor Olivier L. de Weck’s Engineering Systems Lab, wants to empower astronauts to manufacture whatever they need, whenever they need it. (“On the fly,” you could say).

“In-space manufacturing (ISM) — where astronauts can carry out the fabrication, assembly, and integration of components — could revolutionize this paradigm,” says Moraguez. “Since components wouldn’t be limited by launch-related design constraints, ISM could reduce the cost and improve the performance of existing space systems while also enabling entirely new capabilities.”

Historically, a key challenge facing ISM has been correctly pairing components with the manufacturing processes needed to produce them. Moraguez approached this problem by first defining the constraints created by a stressful launch environment, which can limit the size and weight of a payload. He then itemized the challenges that could potentially be alleviated by ISM and developed cost-estimating relationships and performance models to determine the exact break-even point at which ISM surpasses the current approach. 
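The logic of that break-even calculation is simple to state, even though the report’s real cost models are far richer. Here is a stylized sketch, with entirely made-up dollar figures:

```python
import math

def breakeven_units(ism_fixed_cost, ism_unit_cost, launch_unit_cost):
    """Smallest number of components at which in-space manufacturing
    (fixed equipment cost plus a per-unit cost) beats launching each
    finished component from Earth. Returns None if ISM never wins."""
    savings_per_unit = launch_unit_cost - ism_unit_cost
    if savings_per_unit <= 0:
        return None
    return math.ceil(ism_fixed_cost / savings_per_unit)

# Hypothetical numbers: $2M of ISM equipment, $10k per printed part,
# versus $60k to launch each finished spare from Earth.
print(breakeven_units(2_000_000, 10_000, 60_000))  # 40 parts
```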

Moraguez points to Made in Space, an additive manufacturing facility that is currently in use on the International Space Station. The facility produces tools and other materials as needed, reducing both the cost and the wait time of replenishing supplies from Earth. Moraguez is now developing physics-based manufacturing models that will determine the size, weight, and power required for the next generation of ISM equipment.

“We have been able to evaluate the commercial viability of ISM across a wide range of application areas,” says Moraguez. “Armed with this framework, we aim to determine the best components to produce with ISM and their appropriate manufacturing processes. We want to develop the technology to a point where it truly revolutionizes the future of spaceflight. Ultimately, it could allow humans to travel further into deep space for longer durations than ever before,” he says. 

Partnering with industry

The MIT Instrumentation Lab was awarded the first contract for the Apollo program in 1961. In one brief paragraph on a Western Union telegram, the lab was charged with developing the program’s guidance and control system. Today the future of space exploration depends as much as ever on deep collaborations. 

Boeing is a longstanding corporate partner of MIT, supporting such efforts as the Wright Brothers Wind Tunnel renovation and the New Engineering Education Transformation (NEET) program, which focuses on modern industry and real-world projects in support of MIT’s educational mission. In 2020, Boeing is slated to open the Aerospace and Autonomy Center in Kendall Square, which will focus on advancing enabling technologies for autonomous aircraft.

Just last spring the Institute announced a new relationship with Blue Origin, under which MIT will begin planning and developing new payloads for missions to the moon. These new science experiments, rovers, power systems, and more will hitch a ride to the moon via Blue Moon, Blue Origin’s flexible lunar lander. 

Working with IBM, MIT researchers are exploring the potential uses of artificial intelligence in space research. This year, IBM’s AI Research Week (Sept. 16-20) will feature an event, co-hosted with AeroAstro, in which researchers will pitch ideas for projects related to AI and the International Space Station.

“We are currently in an exciting new era marked by the development and growth of entrepreneurial private enterprises driving space exploration,” says Hastings. “This will lead to new and transformative ways for human beings to travel to space, to create new profit-making ventures in space for the world’s economy, and, of course, lowering the barrier of access to space so many other countries can join this exciting new enterprise.”

How “information gerrymandering” influences voters

Many voters today seem to live in partisan bubbles, where they receive only partial information about how others feel regarding political issues. Now, an experiment developed in part by MIT researchers sheds light on how this phenomenon influences people when they vote.

The experiment, which placed participants in simulated elections, found not only that communication networks (such as social media) can distort voters’ perceptions of how others plan to vote, but also that this distortion can increase the chance of electoral deadlock or bias overall election outcomes in favor of one party.  

“The structure of information networks can really fundamentally influence the outcomes of elections,” says David Rand, an associate professor at the MIT Sloan School of Management and a co-author of a new paper detailing the study. “It can make a big difference and is an issue people should be taking seriously.”

More specifically, the study found that “information gerrymandering” can bias the outcome of a vote, such that one party wins up to 60 percent of the time in simulated two-party elections where the opposing groups are equally popular. In a follow-up empirical study of the U.S. federal government and eight European legislative bodies, the researchers also identified actual information networks that show similar patterns, with structures that could skew over 10 percent of the vote in the study’s experiments.

The paper, “Information gerrymandering and undemocratic decisions,” is being published today in Nature.

The authors are Alexander J. Stewart of the University of Houston; Mohsen Mosleh, a research scientist at MIT Sloan; Marina Diakonova of the Environmental Change Institute at Oxford University; Antonio Arechar, an associate research scientist at MIT Sloan and a researcher at the Center for Research and Teaching in Economics (CIDE) in Aguascalientes, Mexico; Rand, who is also the principal investigator for MIT Sloan’s Human Cooperation Lab; and Joshua B. Plotkin of the University of Pennsylvania. Stewart is the lead author.

Formal knowledge

While there is a burgeoning academic literature on media preferences, political ideology, and voter choices, the current study is an effort to create general models of the fundamental influence that information networks can have. Through abstract mathematical models and experiments, the researchers can analyze how strongly networks can influence voter behavior, even when long-established layers of voter identity and ideology are removed from the political arena.

“Part of the contribution here is to try to formalize how information about politics flows through social networks, and how that can influence voters’ decisions,” says Stewart.

The study used experiments involving 2,520 participants, who played a “voter game” in one of a variety of conditions. (The participants were recruited via Amazon’s Mechanical Turk platform and took part in the simulated elections via Breadboard, a platform generating multiplayer network interactions.) The players were divided into two teams, a “yellow” team and a “purple” team, usually with 24 people on each side, and were allowed to change their voting intentions in response to continuously updated polling data.

The participants also had incentives to try to produce certain vote outcomes reflective of what the authors call a “compromise worldview.” For instance, players would receive a (modest) payoff if their team received a super-majority vote share; a smaller payoff if the other team earned a super-majority; and zero payoff if neither team reached that threshold. The election games usually lasted four minutes, during which time each voter had to decide how to vote.
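The payoff structure the authors describe can be written down directly. In the sketch below, the supermajority threshold and the point values are placeholders; the article specifies only the ordering of the three outcomes:

```python
def payoff(own_share, supermajority=0.6, win=100, lose=50, deadlock=0):
    """Payoff for one player given their team's final vote share.
    Ordering from the study: own supermajority pays best, the other
    team's supermajority pays less, and deadlock pays nothing."""
    if own_share >= supermajority:
        return win        # own team reaches a supermajority
    if (1.0 - own_share) >= supermajority:
        return lose       # the opposing team reaches it instead
    return deadlock       # neither side clears the threshold

print([payoff(s) for s in (0.65, 0.35, 0.50)])  # [100, 50, 0]
```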

In general, voters almost always voted for their own party when the polling data showed it had a chance of reaching a super-majority share. They also voted for their own side when the polling data showed a deadlock was likely. But when the opposing party was likely to achieve a super-majority, half the players would vote for it, and half would continue to vote for their own side.

During a baseline series of election games where all the players had unbiased, random polling information, each side won roughly a quarter of the time, and a deadlock without a super-majority resulted about half the time. But the researchers also varied the game in multiple ways. In one iteration of the game, they added information gerrymandering to the polls, such that some members of one team were placed inside the other team’s echo chamber. In another iteration, the research team deployed online bots, comprising about 20 percent of voters, to behave like “zealots,” as the scholars called them; the bots would strongly support one side only.
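A toy simulation shows why those manipulations matter. The decision rule below is a stylized reading of the behavior reported above (defect with probability one-half when the other side appears headed for a supermajority); the 0.6 threshold is an assumption, and the snippet is an illustration rather than the study’s actual model:

```python
import random

def vote(own, other, perceived_other_share, supermajority=0.6, rng=random):
    """Back your own party unless the visible poll shows the other side
    nearing a supermajority, in which case defect half the time rather
    than risk the zero-payoff deadlock."""
    if perceived_other_share >= supermajority and rng.random() < 0.5:
        return other
    return own

# Information gerrymandering in miniature: a yellow voter trapped in a
# purple echo chamber perceives a 0.7 purple share despite a true 50/50
# split, and so defects about half the time.
random.seed(7)
defections = sum(vote("yellow", "purple", 0.7) == "purple" for _ in range(1000))
print(defections / 1000)  # roughly 0.5
```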

After months of iterations of the game, the researchers concluded that election outcomes could be heavily biased by the ways in which the polling information was distributed over the networks, and by the actions of the zealot bots. When members of one party were led to believe that most others were voting for the other party, they often switched their votes to avoid deadlock.

“The network experiments are important, because they allow us to test the predictions of the mathematical models,” says Mosleh, who led the experimental portion of the research. “When we added echo chambers, we saw that deadlock happened much more often — and, more importantly, we saw that information gerrymandering biased the election results in favor of one party over the other.”

The empirical case

As part of the larger project, the team also sought out some empirical information about similar scenarios among elected governments. There are many instances where elected officials might either support their first-choice legislation, settle for a cross-partisan compromise, or remain in deadlock. In those cases, having unbiased information about the voting intentions of other legislators would seem to be very important.

Looking at the co-sponsorship of bills in the U.S. Congress from 1973 to 2007, the researchers found that the Democratic Party had greater “influence assortment” — more exposure to the voting intentions of people in their own party — than the Republican Party during the same period. However, after Republicans gained control of Congress in 1994, their own influence assortment became equivalent to that of the Democrats, as part of a highly polarized pair of legislative influence networks. The researchers found similar levels of polarization in the influence networks of six of the eight European parliaments they evaluated, generally during the last decade.
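One plausible way to formalize “influence assortment” is as the average share of each member’s network neighbors who belong to the member’s own party. The paper’s actual measure is more involved, so treat the following as an illustration only:

```python
from collections import defaultdict

def influence_assortment(edges, party):
    """For each party, average the fraction of a member's neighbors
    (e.g., co-sponsorship ties) who share that member's party label.

    edges : iterable of (member_a, member_b) ties
    party : dict mapping member -> party label
    """
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    shares = defaultdict(list)
    for member, nbrs in neighbors.items():
        same = sum(party[n] == party[member] for n in nbrs)
        shares[party[member]].append(same / len(nbrs))
    return {p: sum(v) / len(v) for p, v in shares.items()}

# Tiny hypothetical network: three Democrats tightly tied, one Republican.
ties = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
print(influence_assortment(ties, {"A": "D", "B": "D", "C": "D", "D": "R"}))
```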

Rand says he hopes the current study will help generate additional research by other scholars who want to keep exploring these dynamics empirically.

“Our hope is that, by laying out this information gerrymandering theory and introducing this voter game, we will spur new research around these topics to understand how these effects play out in real-world networks,” Rand says.

Support for the research was provided by the U.S. Defense Advanced Research Projects Agency, the Ethics and Governance of Artificial Intelligence Initiative of the Miami Foundation, the Templeton World Charity Foundation and the John Templeton Foundation, the Army Research Office, and the David and Lucile Packard Foundation.

MIT report examines how to make technology work for society

Automation is not likely to eliminate millions of jobs any time soon — but the U.S. still needs vastly improved policies if Americans are to build better careers and share prosperity as technological changes occur, according to a new MIT report about the workplace.

The report, which represents the initial findings of MIT’s Task Force on the Work of the Future, punctures some conventional wisdom and builds a nuanced picture of the evolution of technology and jobs, the subject of much fraught public discussion.

The likelihood of robots, automation, and artificial intelligence (AI) wiping out huge sectors of the workforce in the near future is exaggerated, the task force concludes — but there is reason for concern about the impact of new technology on the labor market. In recent decades, technology has contributed to the polarization of employment, disproportionately helping high-skilled professionals while reducing opportunities for many other workers, and new technologies could exacerbate this trend.

Moreover, the report emphasizes, at a time of historic income inequality, a critical challenge is not necessarily a lack of jobs, but the low quality of many jobs and the resulting lack of viable careers for many people, particularly workers without college degrees. With this in mind, the work of the future can be shaped beneficially by new policies, renewed support for labor, and reformed institutions, not just new technologies. Broadly, the task force concludes, capitalism in the U.S. must address the interests of workers as well as shareholders.

“At MIT, we are inspired by the idea that technology can be a force for good. But if as a nation we want to make sure that today’s new technologies evolve in ways that help build a healthier, more equitable society, we need to move quickly to develop and implement strong, enlightened policy responses,” says MIT President L. Rafael Reif, who called for the creation of the Task Force on the Work of the Future in 2017.

“Fortunately, the harsh societal consequences that concern us all are not inevitable,” Reif adds. “Technologies embody the values of those who make them, and the policies we build around them can profoundly shape their impact. Whether the outcome is inclusive or exclusive, fair or laissez-faire, is therefore up to all of us. I am deeply grateful to the task force members for their latest findings and their ongoing efforts to pave an upward path.”

“There is a lot of alarmist rhetoric about how the robots are coming,” adds Elisabeth Beck Reynolds, executive director of the task force, as well as executive director of the MIT Industrial Performance Center. “MIT’s job is to cut through some of this hype and bring some perspective to this discussion.”

Reynolds also calls the task force’s interest in new policy directions “classically American in its willingness to consider innovation and experimentation.”

Anxiety and inequality

The core of the task force consists of a group of MIT scholars. Its research has drawn upon new data, expert knowledge of many technology sectors, and a close analysis of both technology-centered firms and economic data spanning the postwar era.

The report addresses several workplace complexities. Unemployment in the U.S. is low, yet workers have considerable anxiety, from multiple sources. One is technology: A 2018 survey by the Pew Research Center found that 65 to 90 percent of respondents in industrialized countries think computers and robots will take over many jobs done by humans, while less than a third think better-paying jobs will result from these technologies.

Another concern for workers is income stagnation: Adjusted for inflation, 92 percent of Americans born in 1940 earned more money than their parents, but only about half of people born in 1980 can say that.

“The persistent growth in the quantity of jobs has not been matched by an equivalent growth in job quality,” the task force report states.

Applications of technology have fed inequality in recent decades. High-tech innovations have displaced “middle-skilled” workers who perform routine tasks, from office assistants to assembly-line workers, but these innovations have complemented the activities of many white-collar workers in medicine, science and engineering, finance, and other fields. Technology has also not displaced lower-skilled service workers, leading to a polarized workforce. Higher-skill and lower-skill jobs have grown, middle-skill jobs have shrunk, and increased earnings have been concentrated among white-collar workers.

“Technological advances did deliver productivity growth over the last four decades,” the report states. “But productivity growth did not translate into shared prosperity.”

Indeed, says David Autor, who is the Ford Professor of Economics at MIT, associate head of MIT’s Department of Economics, and a co-chair of the task force, “We think people are pessimistic because they’re on to something. Although there’s no shortage of jobs, the gains have been so unequally distributed that most people have not benefited much. If the next four decades of automation are going to look like the last four decades, people have reason to worry.”

Productive innovations versus “so-so technology”

A big question, then, is what the next decades of automation have in store. As the report explains, some technological innovations are broadly productive, while others are merely “so-so technologies” — a term coined by economists Daron Acemoglu of MIT and Pascual Restrepo of Boston University to describe technologies that replace workers without markedly improving services or increasing productivity.

For instance, electricity and light bulbs were broadly productive, enabling the expansion of many other types of work. But the automated self-checkout systems at pharmacies and supermarkets merely replace workers, without notably improving service for the customer or raising productivity.

“That’s a strong labor-displacing technology, but it has very modest productivity value,” Autor says of these automated systems. “That’s a ‘so-so technology.’ The digital era has had fabulous technologies for skill complementarity [for white-collar workers], but so-so technologies for everybody else. Not all innovations that raise productivity displace workers, and not all innovations that displace workers do much for productivity.”

Several forces have contributed to this skew, according to the report. “Computers and the internet enabled a digitalization of work that made highly educated workers more productive and made less-educated workers easier to replace with machinery,” the authors write.

Given the mixed record of the last four decades, does the advent of robotics and AI herald a brighter future, or a darker one? The task force suggests the answer depends on how humans shape that future. New and emerging technologies will raise aggregate economic output and boost wealth, and offer people the potential for higher living standards, better working conditions, greater economic security, and improved health and longevity. But whether society realizes this potential, the report notes, depends critically on the institutions that transform aggregate wealth into greater shared prosperity instead of rising inequality.

One thing the task force does not foresee is a future where human expertise, judgment, and creativity are less essential than they are today.  

“Recent history shows that key advances in workplace robotics — those that radically increase productivity — depend on breakthroughs in work design that often take years or even decades to achieve,” the report states.

As robots gain flexibility and situational adaptability, they will certainly take over a larger set of tasks in warehouses, hospitals, and retail stores — such as lifting, stocking, transporting, and cleaning — as well as awkward physical tasks that require picking, harvesting, stooping, or crouching.

The task force members believe such advances in robotics will displace relatively low-paid human tasks and boost the productivity of workers, whose attention will be freed to focus on higher-value-added work. The pace at which these tasks are delegated to machines will be hastened by slowing labor force growth, tight labor markets, and the rapid aging of workforces in most industrialized countries, including the U.S.

And while machine learning — image classification, real-time analytics, data forecasting, and more — has improved, it may just alter jobs, not eliminate them: Radiologists do much more than interpret X-rays, for instance. The task force also observes that developers of autonomous vehicles, another hot media topic, have been “ratcheting back” their timelines and ambitions over the last year.

“The recent reset of expectations on driverless cars is a leading indicator for other types of AI-enabled systems as well,” says David A. Mindell, co-chair of the task force, professor of aeronautics and astronautics, and the Dibner Professor of the History of Engineering and Manufacturing at MIT. “These technologies hold great promise, but it takes time to understand the optimal combination of people and machines. And the timing of adoption is crucial for understanding the impact on workers.”

Policy proposals for the future

Still, if the worst-case scenario of a “job apocalypse” is unlikely, the continued deployment of so-so technologies could make the future of work worse for many people.

If people are worried that technologies could limit opportunity, social mobility, and shared prosperity, the report states, “Economic history confirms that this sentiment is neither ill-informed nor misguided. There is ample reason for concern about whether technological advances will improve or erode employment and earnings prospects for the bulk of the workforce.”

At the same time, the task force report finds reason for “tempered optimism,” asserting that better policies can significantly improve tomorrow’s work.

“Technology is a human product,” Mindell says. “We shape technological change through our choices of investments, incentives, cultural values, and political objectives.”

To this end, the task force focuses on a few key policy areas. One is renewed investment in postsecondary workforce education outside of the four-year college system — not just in STEM skills (science, technology, engineering, and math) but also in reading, writing, and the “social skills” of teamwork and judgment.

Community colleges are the biggest training providers in the country, with 12 million for-credit and non-credit students, and are a natural location for bolstering workforce education. A wide range of new models for gaining educational credentials is also emerging, the task force notes. The report also emphasizes the value of multiple types of on-the-job training programs for workers.

However, the report cautions, investments in education may be necessary but not sufficient for workers: “Hoping that ‘if we skill them, jobs will come,’ is an inadequate foundation for constructing a more productive and economically secure labor market.”

More broadly, therefore, the report argues that the interests of capital and labor need to be rebalanced. The U.S., it notes, “is unique among market economies in venerating pure shareholder capitalism,” even though workers and communities are business stakeholders too.

“Within this paradigm [of pure shareholder capitalism], the personal, social, and public costs of layoffs and plant closings should not play a critical role in firm decision-making,” the report states.

The task force recommends greater recognition of workers as stakeholders in corporate decision making. Redressing the decades-long erosion of worker bargaining power will require new institutions that bend the arc of innovation toward making workers more productive rather than less necessary. The report holds that the adversarial system of collective bargaining, enshrined in U.S. labor law adopted during the Great Depression, is overdue for reform.

The U.S. tax code can be altered to help workers as well. Right now, it favors investments in capital rather than labor — for instance, capital depreciation can be written off, and R&D investment receives a tax credit, whereas investments in workers produce no such equivalent benefits. The task force recommends new tax policy that would also incentivize investments in human capital, through training programs, for instance.
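A stylized back-of-the-envelope comparison makes the asymmetry visible. The 21 percent tax rate and 10 percent credit below are illustrative assumptions, and real depreciation and credit rules are far more complicated:

```python
def after_tax_cost(outlay, deduction_rate=0.0, credit_rate=0.0, tax_rate=0.21):
    """After-tax cost of an outlay: deductions shelter income at the
    tax rate, while credits reduce taxes dollar for dollar (stylized)."""
    return outlay * (1 - deduction_rate * tax_rate - credit_rate)

# Per dollar spent, under these toy parameters:
print(after_tax_cost(1.0, deduction_rate=1.0))                    # capital: ~$0.79
print(after_tax_cost(1.0, deduction_rate=1.0, credit_rate=0.10))  # R&D: ~$0.69
print(after_tax_cost(1.0))                                        # training: $1.00
```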

Additionally, the task force recommends restoring support for R&D to past levels and rebuilding U.S. leadership in the development of new AI-related technologies, “not merely to win but to lead innovation in directions that will benefit the nation: complementing workers, boosting productivity, and strengthening the economic foundation for shared prosperity.”

Ultimately the task force’s goal is to encourage investment in technologies that improve productivity, and to ensure that workers share in the prosperity that could result.

“There’s no question technological progress that raises productivity creates opportunity,” Autor says. “It expands the set of possibilities that you can realize. But it doesn’t guarantee that you will make good choices.”

Reynolds adds: “The question for firms going forward is: How are they going to improve their productivity in ways that can lead to greater quality and efficiency, and aren’t just about cutting costs and bringing in marginally better technology?”

Further research and analyses

In addition to Reynolds, Autor, and Mindell, the central group within MIT’s Task Force on the Work of the Future consists of 18 MIT professors representing all five Institute schools. Additionally, the project has a 22-person advisory board drawn from the ranks of industry leaders, former government officials, and academia; a 14-person research board of scholars; and eight graduate students. The task force also consulted with business executives, labor leaders, and community college leaders, among others.

The task force follows other influential MIT projects such as the Commission on Industrial Productivity, an intensive multiyear study of U.S. industry in the 1980s. That effort resulted in the widely read book, “Made in America,” as well as the creation of MIT’s Industrial Performance Center.

The current task force taps into MIT’s depth of knowledge across a full range of technologies, as well as its strengths in the social sciences.

“MIT is engaged in developing frontier technology,” Reynolds says. “Not necessarily what will be introduced tomorrow, but five, 10, or 25 years from now. We do see what’s on the horizon, and our researchers want to bring realism and context to the public discourse.”

The current report is an interim finding from the task force; the group plans to conduct additional research over the next year, and then will issue a final version of the report.

“What we’re trying to do with this work,” Reynolds concludes, “is to provide a holistic perspective, which is not just about the labor market and not just about technology, but brings it all together, for a more rational and productive discussion in the public sphere.”

New science blooms after star researchers die, study finds

The famed quantum physicist Max Planck had an idiosyncratic view about what spurred scientific progress: death. That is, Planck thought, new concepts generally take hold after older scientists with entrenched ideas vanish from the discipline.

“A great scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it,” Planck once wrote.

Now a new study co-authored by MIT economist Pierre Azoulay, an expert on the dynamics of scientific research, concludes that Planck was right. In many areas of the life sciences, at least, the deaths of prominent researchers are often followed by a surge in highly cited research by newcomers to those fields.

Indeed, when star scientists die, their subfields see a subsequent 8.6 percent increase, on average, in articles by researchers who had not previously collaborated with those star scientists. Moreover, the papers published by these newcomers are much more likely to be influential and highly cited than other pieces of research.

“The conclusion of this paper is not that stars are bad,” says Azoulay, who has co-authored a new paper detailing the study’s findings. “It’s just that, once safely ensconced at the top of their fields, maybe they tend to overstay their welcome.”

The paper, “Does Science Advance One Funeral at a Time?” is co-authored by Azoulay, the International Programs Professor of Management at the MIT Sloan School of Management; Christian Fons-Rosen, an assistant professor of economics at the University of California at Merced; and Joshua Graff Zivin, a professor of economics at the University of California at San Diego and faculty member in the university’s School of Global Policy and Strategy. It is forthcoming in the American Economic Review.

To conduct the study, the researchers used a database of life scientists that Azoulay and Graff Zivin have been building for well over a decade. In it, the researchers chart the careers of life scientists, looking at accomplishments that include funding awards, published papers and the citations of those papers, and patent statistics.

In this case, Azoulay, Graff Zivin, and Fons-Rosen studied what occurred after the unexpected deaths of 452 life scientists who were still active in their disciplines. In addition to the 8.6 percent increase in papers by new entrants to those subfields, there was a 20.7 percent decrease in papers by the rather smaller number of scientists who had previously co-authored papers with the star scientists.

Overall, Azoulay notes, the study provides a window into the power structures of scientific disciplines. Even if well-established scientists are not intentionally blocking the work of researchers with alternate ideas, a group of tightly connected colleagues may wield considerable influence over journals and grant awards. In those cases, “it’s going to be harder for those outsiders to make a mark on the domain,” Azoulay notes.

“The fact that if you’re successful, you get to set the intellectual agenda of your field, that is part of the incentive system of science, and people do extraordinary positive things in the hope of getting to that position,” Azoulay notes. “It’s just that, once they get there, over time, maybe they tend to discount ‘foreign’ ideas too quickly and for too long.”

Thus what the researchers call “Planck’s Principle” serves as an unexpected — and tragic — mechanism for diversifying bioscience research.

The researchers note that in referencing Planck, they are extending his ideas to a slightly different setting than the one he himself was describing. In his writing, Planck was discussing the birth of quantum physics — the kind of epochal, paradigm-setting shift that rarely occurs in science. The current study, Azoulay notes, examines what happens in everyday “normal science,” in the phrase of philosopher Thomas Kuhn.

The process of bringing new ideas into science, and then hanging on to them, is only to be expected in many areas of research, according to Azoulay. Today’s seemingly stodgy research veterans were once themselves innovators facing an old guard.

“They had to hoist themselves atop the field in the first place, when presumably they were [fighting] the same thing,” Azoulay says. “It’s the circle of life.”

Or, in this case, the circle of life science.

The research received support from the National Science Foundation, the Spanish Ministry of Economy and Competitiveness, and the Severo Ochoa Programme for Centres of Excellence in R&D.

3Q: Heather Hendershot on the state of US political discourse

Heather Hendershot, professor of comparative media studies, researches conservative media and political movements, film and television genres, and American film history. She has authored several books, including “Open to Debate: How William F. Buckley Put Liberal America on the Firing Line” (2016), and recently received a fellowship at the Stanford Humanities Center, where she will work on her next book during the 2019-20 academic year. SHASS Communications spoke with Hendershot about the current state of political media and discourse in the United States.
 
Q: Your book, “Open to Debate,” examined how William F. Buckley’s television program offered deeply intellectual and stimulating conversations with and among individuals who had opposing views. To many, it seems the 2016 presidential election ushered in an era of contentious, hyperpartisan shouting matches. Why don’t we currently have the type of thoughtful dialogue that Buckley provided?

A: My 2016 book argues that “Firing Line,” a public affairs show that aired (mostly) on PBS from 1966 to 1999, offers a model for civil debate that focused on ideas over emotion. At the same time, the show made space for humor, and people did sometimes lose their cool on the air. In other words, it was an intellectual show but also a lively show. It provokes a bit of nostalgia to revisit this kind of TV, given the current climate of loud and obnoxious cable news arguments, with sound bites getting more airtime than careful discussion.

The nostalgia is warranted, but we should not over-romanticize TV history. The fact is, “Firing Line” was not typical. In the pre-cable days, most public affairs shows were deadly dull, news broadcasts assiduously avoided controversy, and Buckley’s show highlighted intellectuals in a way that was unique.

So it’s not so much that political discussion on TV used to be so much better than it is today, but that there used to be one show that really nailed the best way to discuss politics, and now there are (arguably) no such shows. The situation has spiraled since 2016, but it wasn’t great before then. Could we have a successful version of “Firing Line” today? Margaret Hoover rebooted the show on PBS, with some success, though it doesn’t hit Buckley’s intellectual high notes in the same way.

The bottom line is, you can’t have a show exactly like “Firing Line” because Buckley was such a unique personality. Also, in today’s niche media environment, people don’t all watch the same shows like they used to, and it’s hard to stand out with a new program and turn a profit. Furthermore, TV is expensive. I think podcasts are the future (and the present, for that matter) in terms of making room for smart, spirited political discussion. But it remains hard for them to reach a broad audience holding varying political beliefs — hard to get beyond the echo chamber.

Q: Your current research project examines the media coverage of the 1968 Democratic National Convention where, you contend, conservatives’ distrust of the news media began to take root. Today, we have conservative leaders dismissing nearly any news coverage they do not like as “fake news.” Are these responses to news coverage similar because these are two similarly tumultuous times in our history, or has our ability to have thoughtful dialogue dissipated over the past 50 years?

A:  We have to be careful how we use the word “conservative.” It means different things over time, and we would not all agree on what it means now. Many people who identify as conservatives have left the GOP, because they are disturbed by President Trump’s populism, demeanor, and lack of coherent policy objectives. In a recent Atlantic essay, for example, political commentator George Will is quoted saying that Trump has not “made a contribution to our understanding of conservatism.”

From this perspective, it is Trump and his populist base, not conservatives per se, who call news they don’t like “fake news.” I’m a liberal and have no investment in arguing for the integrity of some pure version of conservatism, but separating families and putting them in detention camps and choosing not to protect our elections from foreign interference do not strike me as “conservative” actions, per se.

That said, attacking the media for “liberal bias” is a familiar conservative tactic. The notion preceded the 1968 Democratic National Convention, but at that point it dominated among segregationists in the Deep South who objected to national news coverage of the civil rights movement. (David Greenberg wrote the definitive essay on this.)

What is unique about Chicago is that it was a moment when the idea that the media was unfair was nationalized: Viewers across America saw TV images of Chicago police beating protesters in the streets, and, in effect, said what they were seeing was not reality, that journalists had chosen not to show the violence of the protesters themselves, and that a more balanced picture would have revealed that police behaved appropriately.

Viewers sent angry telegrams to CBS at 2 a.m., just moments after the network signed off during the convention, and letters to CBS in the weeks following the convention ran 11-to-1 against the network. Viewers attacked NBC too, but less so ABC, which did not air complete convention coverage. Congress commissioned an impartial study, which concluded that the protesters had sometimes been violent in Chicago, but that what had happened there was a “police riot” in which protesters, journalists, and even passers-by were beaten bloody by cops, many of whom were out of control. The findings filled an impressive 350-page report.

One takeaway is the obvious point that it is live pictures on TV that resonate most strongly with people, not later reasoned discussions of those images. Other big takeaways for me: This particular attack on the networks was “organic”; it wasn’t organized. And it came from people who self-identified as both conservatives and liberals. Nixon’s genius was to tap into that spontaneous hostile energy and actively, strategically cultivate the idea of liberal media bias. This is one way to trace the lineage of Trump’s “fake news” accusations, though it’s just one piece of the puzzle.

Q: If the general public sees mainstream media outlets as blatantly biased, our current polarization will continue. Do you see this intolerance reflected among your students?

A: I don’t have a master plan to solve these problems, which I agree are grave, but I do think it is helpful to teach people about the history of journalism so they understand how notions of bias and objectivity have played out over time. A highly readable book on this is Michael Schudson’s “Discovering the News: A Social History of American Newspapers.” More recently, there is Matthew Pressman’s book “On Press: The Liberal Values that Shaped the News.”

The crux of this question may be the whole notion of “mainstream media outlets.” What does that mean today? There have long been journals of opinion, such as The Nation on the left and National Review on the right, with the remaining journalism focused on a mass readership assumed to be a mix of liberals and conservatives. Today, opinion seems louder than reporting, and people gravitate to multiple niche outlets that support what they already believe.

What does “mainstream” mean in this context? It means, in part, The New York Times, The Washington Post, The Wall Street Journal, and the Chicago Tribune. These are all outlets that may sometimes exhibit a bias — The Wall Street Journal leans conservative and The New York Times is more centrist. Still, we must adamantly insist that these publications, though they may sometimes frame stories in ways we do not care for, are not simply making things up. Education helps with this, but it’s not going to get through to everyone encompassed by the phrase “the general public.” To many, belief in the mendacity of the mainstream media is precisely that: belief. Like religion, it is unfalsifiable.

I take encouragement from my MIT students, who are so consistently thoughtful about these issues. I teach a course in science fiction, for example, and much of it centers on how we use allegory and other kinds of narrative to think through political crises and strategize for a better world. Most of my students have a technical or scientific orientation, so they tend to take a very rational approach to thinking through arguments. Sometimes we hit a very interesting brick wall when we deal with science fiction texts that are as much about affect as argumentation. How do you argue about feelings, which are simply not empirical in the same way that certain facts are?

Often, it is history that helps us sort things out. I teach “The Handmaid’s Tale,” both the novel and the TV show, for example, and have written about the show. You can’t sort out all the emotional layers of the novel without a deep history lesson on the politics of the Reagan years, the Meese Commission, Women Against Pornography, Take Back the Night marches, and so on.

Time and time again, I find in the classroom it is historical understanding that helps us sort through ways of understanding contemporary issues, even if we cannot come up with easy solutions.

Story prepared by MIT SHASS Communications
Editorial team: Emily Hiestand and Maria Iacobo

Letter regarding the Schwarzman College of Computing Task Force update

The following letter was sent to the MIT community on August 15 by Provost Martin A. Schmidt.

To the members of the MIT community:

I write today to update you on the work of the MIT Stephen A. Schwarzman College of Computing Task Force and the status of the College. The comment period for the task force working group reports has ended, and the final versions and executive summary of the reports are now available, along with a summary of the comments received. I am deeply grateful to everyone who so vigorously engaged in the process—as a member of a working group, as an active participant in one of the community forums, or as a contributor to our web-based idea bank.

The working groups had a number of excellent ideas and provided us with a broad range of perspectives. Moving forward, I will be working with our new dean of the College, Dan Huttenlocher, and the School deans to develop implementation plans for the College.

In the near term, we will need to focus on four items. First, we need to define the status of Electrical Engineering and Computer Science (EECS) faculty in the College. Dan is working closely with the EECS leadership and Dean of Engineering Anantha Chandrakasan on this, with their thinking strongly informed by the ideas of the Organizational Structure Working Group. Similarly, Dan and the Institute for Data, Systems, and Society (IDSS) leadership are working to define the status of IDSS faculty in the College. Second, the work of the Faculty Appointments Working Group has evolved the concept of “bridge faculty,” and Dan is working with the school deans to further advance the “cluster” concept of these faculty appointments. Third, we need to define the details of how best to integrate teaching and research on the societal implications of computing into the fabric of the College.

Finally, I would like to create an ongoing advisory mechanism to facilitate input from the MIT community. The process of establishing the College has benefited greatly from broad community engagement. In that spirit, we will endeavor to share regular updates on the College’s status and will communicate means for the community to continue to share their thoughts.

Sincerely,

Martin A. Schmidt
