People Should Find a Safe Storm Shelter During a Thunderstorm

Storm Shelters in OKC

Tuesday, June 5, 2001 marked the start of an extraordinary time in the history of my beloved Houston. Tropical Storm Allison came to visit that early summer day. The storm moved through quickly that Tuesday. Then Friday arrived, and Allison returned, this time moving slowly, this time from the north. The storm stalled. Thousands of people were driven from their homes. Several leading hospitals closed just when they were needed most. Dozens of major surface roads, and every major highway, were covered in high water.

Yet even before the rain stopped, acts of service to others, and stories of Christian compassion, began to be written. About 75 people had gathered for a couples class at Lakewood Church, one of the largest nondenominational churches in the United States. By the time they were ready to leave, the waters had risen so high they were stranded. Lakewood's facility stayed high and dry in the middle of one of the hardest-hit parts of town. Refugees from the powerful storm began arriving at their doorstep. With no advance preparation, and without waiting for official sanction, those 75 classmates started a disaster shelter that grew to hold over 3,000 people, the largest of more than 30 shelters that would be established at the height of the storm.

Afterward, Lakewood served as a Red Cross Service Center, where help was doled out to those who had suffered losses. When it became clear that FEMA and Red Cross aid would not be enough, Lakewood and Second Baptist-Houston joined to create an adopt-a-family plan to help get folks back on their feet more quickly. In the days that followed, armies of Christians arrived at both churches. People of every economic standing, race, and denomination gathered from all over town. Wet, rotted carpeting was pulled up and sheetrock removed. Piles of donated clothes, food, and bedding were handed out. Elbow grease and cleaning equipment were put to work removing traces of the damage.

If the story stopped here, it would already be an excellent example of practical ministry in a time of disaster, but it continues. Many other churches served as shelters and, in the days that followed, as Red Cross Service Centers. Scores of new volunteers, many of them Christians, were put through accelerated training and put to work. That Saturday I was trapped in my own subdivision, certain that my family was safe because I worked in Storm Shelters OKC, near where I lived. What the survivors would not permit the storm to take was their desire to give, their faith, or their self-respect. I saw so many people praising the Lord as they brought gifts of food, clothes, and bedding. I saw young children coming with their parents to give new or rarely used toys to kids who had none.

Leaning On God Through Hard Times

Unity Church of Christianity, from a part of town unaffected by the storm, sent a sizable supply of bedding and other provisions. A small troupe of musicians and Christian clowns arrived and asked to be allowed to entertain the children in the shelter where I served. We, of course, promptly accepted their offer. They gathered the children in a large empty stretch of floor. They sang, they told stories, they made balloon animals. The frightened, at least temporarily displaced children laughed.

When not occupied elsewhere, I did a lot of listening. I listened to disappointed survivors and frustrated relief workers. I listened to children trying to make the best of a situation they could not comprehend. These are only the stories I have seen or heard myself. I know that churches, religious groups, and many other individual Christians served admirably, and I want to thank them for their efforts in this disaster. I thank the Lord for providing them to serve.

I didn't write this so you would feel sorry for Houston or its people. Rather, what I saw as this disaster unfolded strengthened my belief that the Lord will provide for us through our brothers and sisters in faith. No matter how badly your community is hit, you, the individual Christian, can be part of the remedy. Those blankets you have stored away and will probably never use mean a great deal to people who have none. You can help if you can drive. You can help if you can set up a cot. You can help if you can scrub a wall. You can help if all you can do is sit and listen. Large catastrophes like Allison get a lot of attention, but a disaster can come in any size. If a single home burns, that is a serious disaster to the family that called it home. It will be generations before the people here forget Allison.

United States Oil and Gas Exploration Opportunities

Firms investing in this sector can explore, develop, and produce, and enjoy the advantages of a global oil and gas portfolio without the usual political and economic disadvantages. The US permitting regime and financial conditions are rated among the best in the world, and petroleum produced in the US is sold at international prices. Firms are also likely to benefit from a booming domestic market. Most petroleum exploration in the US to date has been concentrated around the Taranaki Basin, where some 500 exploration wells have been drilled. Many US sedimentary basins, however, remain unexplored; many show evidence of petroleum seeps, and survey data have revealed formations with high hydrocarbon potential. There have also been onshore gas discoveries before, including the Great South, East Coast, and offshore Canterbury basins.

Demand for petroleum is expected to grow strongly during this period, which only brightens the sector's prospects. Demand is anticipated to reach 338 PJ per annum. The US government is eager to augment the oil and gas supply. Because new discoveries are needed to meet domestic demand, raise the level of self-reliance, and minimize the cost of petroleum imports, the oil and gas exploration sector is considered one of the sunrise sectors. The US government has devised a distinctive approach to reach its petroleum and gas exploration targets: it has developed a "Benefit For Attempt" model for petroleum and gas exploration projects in the US.

In current analysis, "Benefit For Attempt" is defined as oil reserves found per kilometer drilled. This helps derive an estimate of reserves discovered for each kilometer drilled and each dollar spent on exploration. Because the cost of exploration weighs on exploration activity, the US government has shown considerable signs that it will introduce favorable changes to encourage exploration of new oil reserves. The government has made information about the country's oil potential available in its study reports. Transparency of information on royalty and allocation regimes, and simplicity of procedures, have enhanced the attractiveness of the petroleum and natural gas sector in the United States.
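
The metric described above is a simple ratio. As an illustration only (the figures below are hypothetical, since the source quotes none), it can be computed directly:

```python
def benefit_for_attempt(reserves_found_bbl: float, km_drilled: float) -> float:
    """'Benefit For Attempt': oil reserves found per kilometer drilled."""
    return reserves_found_bbl / km_drilled

def reserves_per_dollar(reserves_found_bbl: float, exploration_cost_usd: float) -> float:
    """Companion estimate: reserves found per exploration dollar spent."""
    return reserves_found_bbl / exploration_cost_usd

# Hypothetical program: 2.4M barrels found over 60 km of drilling, $90M spent
print(benefit_for_attempt(2_400_000, 60))    # 40000.0 barrels per km
print(reserves_per_dollar(2_400_000, 90e6))  # ~0.027 barrels per dollar
```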

Petroleum was the third-largest export earner for the US in 2008, and the opportunity to sustain the sector's growth is broadly available through new exploration endeavors. The government is poised to keep up the momentum in this sector. Many firms are now active with new exploration projects in the Challenger Plateau, the Northland East Slope Basin region, the outer Taranaki Basin, and the Bellona Trough region. The 89 Energy oil and gas sector reassures foreign investors, as the government, to encourage growth, has declared a five-year continuation of an exemption for offshore petroleum and gas exploration in its 2009 budget. The government also provides nonresident rig operators with tax breaks.

Modern Robot Duct Cleaning Uses

Heating, ventilation, and air conditioning (HVAC) systems collect pollutants and contaminants like mold, debris, dust, and bacteria that can have an adverse impact on indoor air quality. Most people are now aware that indoor air pollution can be a health concern, and the topic has thus gained increased visibility. Studies have also suggested that cleaning HVAC systems enhances their efficiency and contributes to a longer operating life, along with maintenance and energy cost savings. Duct cleaning is the cleaning of the components of forced-air heating, ventilation, and cooling systems. Robots are an advantageous tool, improving both the cost and the efficiency of the procedure. Using robots for duct cleaning is therefore no longer a new practice.

A clean air duct system creates a cleaner, healthier indoor environment while lowering energy costs and increasing efficiency. As we spend more hours indoors, air duct cleaning has become an important part of the cleaning sector. Indoor pollutant levels can build up. Health effects can show up immediately or years after repeated or long exposure, ranging from respiratory diseases to cardiovascular disease and cancer, conditions that can be debilitating or deadly. It is therefore wise to ensure that indoor air quality is not compromised inside buildings. According to the Environmental Protection Agency, levels of hazardous pollutants found indoors can exceed those of outdoor air.

Duct cleaning by Air Duct Cleaning Edmond professionals removes both visible contaminants and microbial contaminants that may not be visible to the naked eye. These can degrade indoor air quality and present a health hazard, as air ducts can host a number of hazardous microbial agents. Legionnaires' disease is one malady that has gained public notice, because our modern surroundings support the growth of the bacteria that cause the affliction and have the potential to cause outbreaks. Typical disease-promoting environments include moisture-producing equipment, such as poorly maintained cooling towers in air-conditioned buildings. In short, in designing and building systems to control our surroundings, we have created perfect conditions for the disease. Those systems must be properly monitored and maintained; that is the secret to controlling this disorder.

Robots allow the job to be done faster while saving workers from exposure. Signs of technological progress in the duct cleaning business are apparent in the variety of equipment now available, including an array of robotic gear for use in air duct cleaning. Robots are invaluable in hard-to-reach places. Robots once used only to inspect conditions inside the duct can now be used for spraying, cleaning, and sampling procedures. The remote-controlled robotic gear can be fitted with tool and fastener attachments to serve many different functions.

Video recorders and a closed-circuit television camera system can be attached to the robotic gear to view conditions and operations, and for documentation purposes. Inspection devices on the robot examine the inside of the ducts. Robots can travel to particular sections of the system and move around barriers. Some combine functions that enable cleaning operations and fit into small ducts. They can deliver a useful viewing range, with models delivering disinfection, cleaning, inspection, coating, and sealing abilities economically.

The remote-controlled robotic gear comes in various shapes and sizes for different uses. The first use of robotic video cameras was in the 1980s, to record conditions inside the duct. Robotic cleaning systems now have many more uses. These devices provide improved access for better cleaning and reduce labor costs. Lately, the range of uses for small mobile robots in the service industries has expanded, including inspection and duct cleaning.

More improvements are being considered to make an already productive tool even more effective. If you decide to have your heating, ventilation, and cooling system cleaned, it is important to make sure the provider is qualified and cleans all parts of the system. Failure to clean one part of a contaminated system can lead to re-contamination of the entire system.

When To Call A DWI Attorney

Charges or fines against a DWI offender require a qualified Sugar Land criminal defense attorney in order to reduce or dismiss those charges or fines. So, undoubtedly, anyone in that position needs a DWI attorney. Even for a first-time violation the penalties can be severe, so being represented by a qualified DWI attorney is vitally important. If you are facing subsequent charges for DWI, the punishments can be harsher and include felony charges. Finding an excellent attorney is thus a job you should approach as soon as possible.

Every state within America makes its own laws and legislation regarding DWI violations, so you must bear in mind that you should hire a DWI attorney who practices within the state where the violation occurred. This is because they will have the knowledge and expertise of the relevant state law to adequately defend you, and will be familiar with the processes and tests performed to establish your guilt.

As your attorney, they will look at the tests that were completed at the time of your arrest and the accompanying police evidence to assess whether those tests were accurately performed, carried out by competent staff, and whether the right procedures were followed. Police testimony can also be challenged in court, although it is not often that police testimony is argued against.

When you start looking for a DWI attorney, you should try to locate someone who specializes in these kinds of cases. While many attorneys may be willing to take on your case, a lawyer who specializes in these cases has the skilled knowledge needed to interpret the scientific and medical tests run when you were detained. The first consultation is free and provides you with the chance to ask them about their experience in these cases and their fees.

Many attorneys will work according to an hourly fee or on a set-fee basis determined by the kind of case. You may find that how they are paid can be negotiated to suit your financial situation, including the terms of their fee. If you are unable to afford a private attorney, you can request a court-appointed attorney paid for by the state. Before you hire a DWI attorney, you should make sure you understand the precise charges imposed against you and when you are expected to appear in court.

How a Credit Card Works

The credit card makes your life easier, supplying an amazing set of options. The credit card is a retail transaction settlement and credit system, operated through the little plastic card that bears its name. Regulated by ISO 7810, which defines credit cards, the card itself always takes the same structure, size, and shape. A strip of special material on the card (the substance resembles a floppy disk or a magnetic tape) stores all the necessary data. This magnetic strip enables the credit card's validation. The design has also become an important variable; an enticing credit card design is desirable, but the card must above all keep its information reliable and secure.
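
Part of that validation happens before any network call is made: the card number itself carries a check digit computed with the Luhn algorithm (part of the ISO/IEC 7812 numbering scheme), so an obviously mistyped number can be rejected immediately. A minimal sketch:

```python
def luhn_valid(card_number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in card_number if d.isdigit()]
    total = 0
    # Walking from the rightmost digit, double every second digit;
    # if doubling produces a two-digit number, subtract 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4539 1488 0343 6467"))  # True: checksum passes
print(luhn_valid("4539 1488 0343 6468"))  # False: one altered digit breaks it
```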

A credit card is supplied to the user only after a bank approves an account, weighing a varied set of variables to ascertain financial dependability. This bank is the credit provider. When an individual makes a purchase, he must sign a receipt to verify the transaction, which records the card details and the amount of money to be paid. Many shops accept electronic authorization for credit cards and use cloud tokenization for authorization. Nearly all verifications are made using an electronic verification system, which allows checking that the card is valid. Any retailer may also check whether the customer has enough credit to cover the purchase he is attempting to make while staying within his credit limit.
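
The limit check described above reduces to simple arithmetic. A hypothetical sketch of the authorization decision (not any real payment processor's API):

```python
from dataclasses import dataclass

@dataclass
class Account:
    credit_limit: float
    balance: float  # amount currently owed

    def authorize(self, amount: float) -> bool:
        """Approve a purchase only if it keeps the balance within the limit."""
        if amount <= 0:
            return False
        if self.balance + amount > self.credit_limit:
            return False  # declined: would exceed the credit limit
        self.balance += amount  # record the authorized charge
        return True

acct = Account(credit_limit=1000.0, balance=850.0)
print(acct.authorize(100.0))  # True: new balance of 950 stays within the limit
print(acct.authorize(100.0))  # False: 1050 would exceed the 1000 limit
```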

As the credit provider, it is up to the bank to keep the user informed of his statement. Banks typically send monthly statements detailing each transaction processed through the card, the outstanding fees, and the sums owed. This enables the cardholder to ensure all the payments are correct, and to discover mistakes or fraudulent activity to dispute. Interest is typically charged, and a minimum repayment amount established, by the end of the following billing cycle.
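
As an illustration of that cycle, with hypothetical figures and a simplified monthly-periodic-rate model (real issuers typically compute interest on the average daily balance), the interest and minimum payment might be derived like this:

```python
def month_end_charges(balance: float, apr: float,
                      min_pct: float = 0.02, min_floor: float = 25.0):
    """Return (interest, minimum_payment) for one billing cycle.

    balance:   carried-over balance at cycle end
    apr:       annual percentage rate, e.g. 0.199 for 19.9%
    min_pct:   minimum payment as a fraction of the new balance (assumed)
    min_floor: smallest minimum payment the issuer accepts (assumed)
    """
    interest = balance * (apr / 12)  # one month's share of the annual rate
    new_balance = balance + interest
    minimum = max(new_balance * min_pct, min_floor)
    return round(interest, 2), round(minimum, 2)

interest, minimum = month_end_charges(1200.0, 0.199)
print(interest)  # 19.9  -> one month of interest on $1,200 at 19.9% APR
print(minimum)   # 25.0  -> the $25 floor applies, since 2% of $1,219.90 is less
```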

The precise way the interest is charged is normally set out in an initial agreement, and the provider specifies these elements on the back of the credit card statement. Generally, the credit card is a simple form of revolving credit from one month to the next. It can also be a sophisticated financial instrument, with many balance segments that afford greater scope for credit management. Interest rates may also differ from one card to another. Credit card promotion services use appealing incentives to keep their existing customers and find some new ones along the way.

Why Get Help From a Property Management Company?

One solution for collecting the revenue of your rental home while removing much of the anxiety is to contact and engage a property management company in Oklahoma City, Oklahoma. If you are considering this option and wish to know more, please read the rest of the post. As many landlords understand, leasing out your piece of real property can be a real cash cow, but that cash flow usually comes with a tremendous concern. Late-night phone calls from tenants, the trouble of marketing the house when you have a vacancy, overdue rent payments you must chase down, and overflowing lavatories all take a lot of the pleasure out of earning money off of leases. One solution that removes much of the anxiety is to engage a property management company.

These businesses act as the go-between for you and the tenant. When you hire a property management company, the tenant never actually needs to know who you are. The company manages the day-to-day relationship with the tenant while you retain the ability to make the final judgments regarding the home. If you have a vacant unit, the company can manage the marketing for you. Since the company will have more connections in a bigger market, and knows the industry better than you do, you'll find your unit gets filled much more quickly with their help. In addition, the property management company will take care of screening prospective tenants. Depending on the arrangement you have, you may still have the final say on whether a tenant is qualified for the unit, but the day-to-day difficulty of locating a suitable tenant is no longer your problem. They'll also manage the move-in inspections as well as the inspections required after a tenant moves out.

Once the unit is filled, you can step back and watch the profits. If there is an issue, the company will handle communication with the tenant. You won't be telephoned if a pipe bursts in the middle of the night. The tenant calls your representative at the company, who then makes the arrangements required to get the issue repaired by a maintenance provider. You may get a phone call a day later, or may not know there was an issue at all until you check in with the business. The property management company will also collect your rental payments. If a tenant is late making a payment, the company will do what's required to collect. In certain arrangements, the company will also take over paying taxes, insurance, and the mortgage on the piece of property. You need do nothing but enjoy, after all the bills are paid, the revenue that is sent your way.

With all these advantages, you're probably wondering what the downside to employing a property management company must be. The primary factor that stops some landlords from hiring one is the price: you will pay for all of these services. You must weigh that price against the time you'll save, time you can then use to pursue additional revenue-producing efforts or simply to enjoy the fruits of your investment.

Benefits From Orthodontic Care

Orthodontics is the specialty of dentistry centered on the diagnosis and treatment of dental and related facial problems. The outcomes of Norman Orthodontist OKC treatment can be dramatic: lovely smiles, improved oral health, better aesthetics, and an advanced quality of life for many individuals of all ages. Whether cosmetic dental attention is needed or not is an individual's own choice. Most people tolerate conditions like various kinds of bite issues or overbites and don't get treated. Nevertheless, a number of us feel more confident with teeth that are properly aligned, appealing, and easier to care for. Orthodontic care can improve both appearance and chewing strength. It can also help you speak with clarity or chew better.

Orthodontic care isn't only cosmetic in character. It can also benefit long-term oral health. Straight, properly aligned teeth are easier to floss and clean, which can reduce the risk of decay. It can also prevent the gum irritation that leads to periodontitis, which occurs once microorganisms gather around the area where the teeth and the gums meet. Untreated, periodontitis can end in infection; such a condition can destroy the bone that surrounds the teeth and result in tooth loss. People with harmful bites chew with less efficiency. Some people with a serious bite problem may have difficulty obtaining enough nutrients; this can happen when the teeth aren't aligned correctly. Repairing bite issues can make it easier to chew and digest meals.

One may also have speech problems when the top and lower front teeth do not align right. These are fixed through therapy, occasionally combined with surgical help. Finally, treatment can help avoid early wear of the rear teeth. As you chew down, your teeth sustain a great amount of pressure; if your top teeth do not match up, they will cause your back teeth to degrade. The most frequently encountered forms of therapy are braces (or retainers) and headgear. However, a lot of people complain about discomfort with these techniques, which, unfortunately, is also unavoidable. Braces can cause sores, and some individuals have problems talking. Dental practitioners, though, state the hurting normally disappears within a few days. Occasionally braces cause irritation. If you'd like to avoid more unpleasant sensations, stick to fresh, soft, bland food and avoid anything hard or crunchy. In addition, do not take your braces off unless the medical professional says so.

It is advised that you see your medical professional often for examinations to head off possible problems that may appear while receiving therapy. You will be prescribed a specific dental hygiene regimen, if necessary. Dental specialists now look out for the identification and management of malocclusion. Orthodontia, this specialty of medicine, mainly targets repairing jaw problems and teeth, your smile, and thus your bite. Orthodontists, however, don't only do jaw remedies and emergency dental work; they also handle mild to severe dental conditions that could grow into risky states. You don't have to measure your whole life by one predicament. See a dental specialist, and you'll discover just how stunning your smile can soon be.

Nine universities team up to create global infrastructure for digital academic credentials

While digital technology has started to transform education by enabling new learning pathways that are customized to each individual’s needs, the way that educational institutions issue and manage academic credentials has not changed much. Nine leading universities announced that they have formed the Digital Credentials collaboration in order to create a trusted, distributed, and shared infrastructure standard for issuing, storing, displaying, and verifying academic credentials.

“Currently, those who successfully complete a degree from an institution must go back to that institution — sometimes by mail or even in person — each time there is a need to verify the academic credentials earned,” says Sanjay Sarma, MIT vice president for open learning. “This can be a complicated problem, especially if the learner no longer has access to the university. Such is the case with many refugees, immigrants, and displaced populations.” 

The universities working on this effort include Delft University of Technology, the Netherlands; Harvard University Division of Continuing Education; the Hasso Plattner Institute, University of Potsdam, Germany; Massachusetts Institute of Technology; Tecnologico de Monterrey, Mexico; Technical University of Munich, Germany; University of California, Berkeley; University of California, Irvine; and the University of Toronto, Canada. 

“As teaching and learning offered by our universities has come to encompass digital platforms, and as each of our learners have gained the power to shape their own educational trajectory over a lifetime, the question of trusted verification and authentication of learning and credentials poses itself with broad urgency,” says Diana Wu, dean of university extension and new academic ventures at UC Berkeley.

Using technology that relies on strong cryptography to prevent tampering and fraud, and shared ledgers to create a global infrastructure for anchoring academic achievements, the researchers plan to build upon earlier research and pioneering efforts by their institutions — including MIT’s pilot program for issuing all of its graduates a digital version of their diploma that is verified against a blockchain. 
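
A minimal sketch of the core idea (this is an illustration under simplifying assumptions, not the actual MIT or Digital Credentials implementation, which also involves digital signatures and Merkle proofs): at issuance, a hash of the credential is anchored on a shared ledger; verification later means recomputing the hash and comparing, so any tampering is detectable without contacting the issuer.

```python
import hashlib
import json

def credential_hash(credential: dict) -> str:
    """Canonicalize a credential as sorted, compact JSON and hash it."""
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify(credential: dict, anchored_hash: str) -> bool:
    """A credential verifies if its digest matches the hash anchored on the ledger."""
    return credential_hash(credential) == anchored_hash

# Hypothetical diploma record
diploma = {"learner": "Jane Doe", "degree": "SB", "issuer": "Example University", "year": 2019}
anchor = credential_hash(diploma)   # published to the shared ledger at issuance

print(verify(diploma, anchor))                      # True: record is intact
print(verify(dict(diploma, degree="PhD"), anchor))  # False: any alteration breaks the match
```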

One of the driving forces behind this shared effort is the interest by universities to utilize the advances brought by these new technologies in a way that prioritizes the needs of learners. Digital credentials allow learners to maintain a compelling and verifiable digital record of their lifelong learning achievements that may include badges, internships, bootcamps, certificates, MicroMasters (graduate-level courses), and stackable combinations thereof, as well as traditional degrees — all of which they can easily share with employers or other institutions. Institutions can record and manage the achievements of their learners in a way that is easy, safe, and inexpensive, and minimizes the risk of identity fraud.

“We are well-positioned in academia to use cutting-edge technology to empower learners to advance their careers and education with credentials in the palms of their hands,” says Hans Pongratz, senior vice president for IT systems and services at Technical University of Munich.

The team has now set their sights on the evolution and governance of a shared standard. “Digital credentials are like tokens of social and human capital and hold tremendous value for the individual. The crucial opportunity we have today is to bring together institutions that share a commitment to the benefit of learners, and who can act as stewards of this infrastructure,” says Philipp Schmidt, director of learning innovation at the MIT Media Lab. 

“Our shared vision is one where academic achievements, and the corresponding credentials that verify them, can open up new pathways for individuals to become who they want to be in the future,” says José Escamilla, director of TecLabs Learning Reimagined at Tecnologico de Monterrey.

To learn more about this project, visit digitalcredentials.mit.edu.

J-PAL North America’s newest initiative explores the work of the future

“The future of work will be determined by who wields power and for what purposes. We are in a moment of great transition — we have an opportunity to imagine what a new social contract can be,” said Sarita Gupta, executive director of Jobs with Justice and co-director of Caring Across Generations as she kicked off last Friday’s launch of J-PAL North America’s Work of the Future Initiative.

Gupta opened the conference with a powerful call to action for participants to shift the narrative around “the future of work.” The newest initiative from J-PAL North America, a research center in MIT’s Department of Economics, the Work of the Future Initiative seeks to identify effective, evidence-based strategies that increase opportunity, reduce disparities, and help all workers navigate the work of the future.

Millions of workers throughout the industrialized and developing worlds could be affected by automation, rising inequality, stagnating educational attainment, and other labor market trends in the coming decades. Many workers lack access to jobs that pay living wages, have jobs with insufficient benefits or protections, or lack the necessary skills or education to progress within their industries in the face of technological change.

By spurring research on effective ways to help workers thrive in today’s changing labor market, the Work of the Future Initiative aims to center worker voices and create a more equitable future of work. The conference addressed a number of big questions, including: How can the future of work be made more equitable, efficient, and just?

“J-PAL North America’s Work of the Future Initiative was launched to catalyze rigorous research on these urgent questions,” explained David Autor, the Ford Professor of Economics at MIT and co-chair of the new initiative.

Autor also serves as vice-chair of the Institute’s complementary Work of the Future Task Force, a recently-launched group of MIT faculty and researchers exploring how emerging technologies are changing the nature of human work and what types of education and skills will enable humans to thrive in the digital economy.

The Initiative’s academic leadership, including Autor, co-chairs Matthew Notowidigdo of Northwestern University, and J-PAL Scientific Director Lawrence Katz of Harvard University, recognized that across the country, policymakers, industry leaders, and social service providers are actively seeking solutions to labor market challenges.

Many well-intentioned, potentially effective ideas remain untested, however, leaving policymakers without the necessary evidence to assess what will be helpful, neutral, or harmful. Too often, academic researchers, government agencies, and nonprofit and industry leaders are working on these critical problems in isolation, and don’t have the time or resources to tap into each other’s expertise.

J-PAL’s newest initiative seeks to fill this gap by generating new research to help answer these important questions. It will catalyze this kind of rigorous, actionable evidence through an innovation competition model and a researcher-facing request for proposals (RFP). 

The innovation competition is currently accepting promising research proposals from practitioners across the country, and will work with selected partners to develop a feasible, rigorous evaluation of a program or policy focused on the future of work.

Selected applicants will receive technical support from J-PAL staff, flexible funding to get an evaluation off the ground, and access to J-PAL’s network of leading academic researchers to help them design and implement randomized evaluations of their programs.

Evelyn Diaz, president of Heartland Alliance and a panelist at the kick-off event, explained why this kind of rigorous evaluation is critical to an organization’s success. “There is a fear of failure about evaluation, and we need to change the narrative,” Diaz said. “The focus should instead be on how we are learning.”

Those seeking to learn more about the competition are encouraged to sign up for J-PAL’s informational webinar on June 26. Through the competition, along with a bi-annual, researcher-facing RFP, the initiative aims to generate actionable research on questions related to the future of work.

Meanwhile, with conferences like last Friday’s kick-off event, the initiative will also serve as a convener to bring together leading voices in the future of work space. At the kick-off, participants from academic institutions, nonprofits, philanthropies, and the private sector gathered to share insights, learn from each other’s different experiences, and brainstorm solutions to complex research questions.

Event highlights included a number of engaging, interdisciplinary panels on challenges and opportunities related to the work of the future.

Gupta and Katz, for example, participated in a lively discussion with Abigail Wozniak from the Federal Reserve Bank of Minneapolis on how to shift narratives around the future of work.

Later in the day, Notowidigdo presented key findings from his recent research agenda on the Work of the Future, co-authored by Autor and Northwestern University graduate student Anran Li, and an interdisciplinary panel of industry and nonprofit leaders and academic researchers provided thoughtful commentary on the research agenda.

Jed Kolko, chief economist at Indeed, echoed the review paper’s call for more rigorous research on these topics. “There is a lot of uncertainty about the effect of automation technology on employment. Setting up experiments that will be able to measure those effects is critical.”

David Autor also presented innovative research on how work has — and hasn’t — changed over time, and the implications of this research for worker well-being, and an interdisciplinary panel of researchers and practitioners discussed how they formed mutually beneficial research-practitioner partnerships.

To wrap up the day, J-PAL North America Executive Director Mary Ann Bates moderated a wide-ranging panel on the changing nature of work in the United States that included Katz, J-PAL affiliate Damon Jones, and Julie Gehrki, vice president of philanthropy at the Walmart Foundation.

Bates’ opening remarks on the motivating principle behind the initiative set the tone for the rest of the day’s discussions. “The reason why we care about these topics is because of people,” she said.

Can science writing be automated?

The work of a science writer, this one included, involves reading journal papers filled with specialized technical terminology and figuring out how to explain their contents in language that readers without a scientific background can understand.

Now, a team of scientists at MIT and elsewhere has developed a neural network, a form of artificial intelligence (AI), that can do much the same thing, at least to a limited extent: It can read scientific papers and render a plain-English summary in a sentence or two.

Even in this limited form, such a neural network could be useful for helping editors, writers, and scientists scan a large number of papers to get a preliminary sense of what they’re about. But the approach the team developed could also find applications in a variety of other areas besides language processing, including machine translation and speech recognition.

The work is described in the journal Transactions of the Association for Computational Linguistics, in a paper by Rumen Dangovski and Li Jing, both MIT graduate students; Marin Soljačić, a professor of physics at MIT; Preslav Nakov, a senior scientist at the Qatar Computing Research Institute, HBKU; and Mićo Tatalović, a former Knight Science Journalism fellow at MIT and a former editor at New Scientist magazine.

From AI for physics to natural language

The work came about as a result of an unrelated project, which involved developing new artificial intelligence approaches based on neural networks, aimed at tackling certain thorny problems in physics. However, the researchers soon realized that the same approach could be used to address other difficult computational problems, including natural language processing, in ways that might outperform existing neural network systems.

“We have been doing various kinds of work in AI for a few years now,” Soljačić says. “We use AI to help with our research, basically to do physics better. And as we got to be  more familiar with AI, we would notice that every once in a while there is an opportunity to add to the field of AI because of something that we know from physics — a certain mathematical construct or a certain law in physics. We noticed that hey, if we use that, it could actually help with this or that particular AI algorithm.”

This approach could be useful in a variety of specific kinds of tasks, he says, but not all. “We can’t say this is useful for all of AI, but there are instances where we can use an insight from physics to improve on a given AI algorithm.”

Neural networks in general are an attempt to mimic the way humans learn certain new things: The computer examines many different examples and “learns” what the key underlying patterns are. Such systems are widely used for pattern recognition, such as learning to identify objects depicted in photos.

But neural networks in general have difficulty correlating information from a long string of data, such as is required in interpreting a research paper. Various tricks have been used to improve this capability, including techniques known as long short-term memory (LSTM) and gated recurrent units (GRU), but these still fall well short of what’s needed for real natural-language processing, the researchers say.
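The long-range difficulty can be seen in a toy calculation (an illustration of the general vanishing/exploding problem, not an example from the paper): when a recurrent state is repeatedly scaled by a weight, the contribution of the earliest inputs either decays toward zero or blows up, which is exactly what LSTM and GRU gating tries to mitigate.

```python
# Toy illustration (not from the paper): repeatedly scaling a
# recurrent state shows why the influence of early inputs fades or
# explodes over long sequences.
def recurrent_trace(weight, steps):
    state = 1.0
    for _ in range(steps):
        state *= weight
    return state

print(recurrent_trace(0.9, 100))  # early signal has all but vanished
print(recurrent_trace(1.1, 100))  # or has exploded
```

After 100 steps a weight of 0.9 leaves a signal of roughly 0.00003, while a weight of 1.1 inflates it past 13,000 — neither regime preserves information faithfully.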

The team came up with an alternative system, which instead of being based on the multiplication of matrices, as most conventional neural networks are, is based on vectors rotating in a multidimensional space. The key concept is something they call a rotational unit of memory (RUM).

Essentially, the system represents each word in the text by a vector in multidimensional space — a line of a certain length pointing in a particular direction. Each subsequent word swings this vector in some direction, represented in a theoretical space that can ultimately have thousands of dimensions. At the end of the process, the final vector or set of vectors is translated back into its corresponding string of words.
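A minimal sketch of the rotation idea (a toy two-dimensional analogue, not the actual RUM architecture, and the per-word angles are invented): each input nudges the state by a rotation, and because rotations preserve a vector's length, the state neither vanishes nor explodes no matter how many updates are applied.

```python
import math

def rotate2d(vec, theta):
    """Rotate a 2-D vector by angle theta (radians)."""
    x, y = vec
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

# Toy 2-D stand-in for a rotational memory update: each "word"
# contributes an angle, and the state vector is rotated accordingly.
state = (1.0, 0.0)
for theta in (0.3, -0.1, 0.5):   # hypothetical per-word angles
    state = rotate2d(state, theta)

# Rotations are norm-preserving, so the memory's magnitude is
# unchanged after arbitrarily many updates.
print(round(math.hypot(*state), 6))  # -> 1.0
```

This norm preservation is the contrast with ordinary matrix multiplication, whose repeated application can shrink or inflate the state.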

“RUM helps neural networks to do two things very well,” Nakov says. “It helps them to remember better, and it enables them to recall information more accurately.”

After developing the RUM system to help with certain tough physics problems such as the behavior of light in complex engineered materials, “we realized one of the places where we thought this approach could be useful would be natural language processing,” says Soljačić,  recalling a conversation with Tatalović, who noted that such a tool would be useful for his work as an editor trying to decide which papers to write about. Tatalović was at the time exploring AI in science journalism as his Knight fellowship project.

“And so we tried a few natural language processing tasks on it,” Soljačić says. “One that we tried was summarizing articles, and that seems to be working quite well.”

The proof is in the reading

As an example, they fed the same research paper through a conventional LSTM-based neural network and through their RUM-based system. The resulting summaries were dramatically different.

The LSTM system yielded this highly repetitive and fairly technical summary: “Baylisascariasis,” kills mice, has endangered the allegheny woodrat and has caused disease like blindness or severe consequences. This infection, termed “baylisascariasis,” kills mice, has endangered the allegheny woodrat and has caused disease like blindness or severe consequences. This infection, termed “baylisascariasis,” kills mice, has endangered the allegheny woodrat.

Based on the same paper, the RUM system produced a much more readable summary, and one that did not include the needless repetition of phrases: Urban raccoons may infect people more than previously assumed. 7 percent of surveyed individuals tested positive for raccoon roundworm antibodies. Over 90 percent of raccoons in Santa Barbara play host to this parasite.

Already, the RUM-based system has been expanded so it can “read” through entire research papers, not just the abstracts, to produce a summary of their contents. The researchers have even tried using the system on their own research paper describing these findings — the paper that this news story is attempting to summarize.

Here is the new neural network’s summary: Researchers have developed a new representation process on the rotational unit of RUM, a recurrent memory that can be used to solve a broad spectrum of the neural revolution in natural language processing.

It may not be elegant prose, but it does at least hit the key points of information.

Çağlar Gülçehre, a research scientist at the British AI company Deepmind Technologies, who was not involved in this work, says this research tackles an important problem in neural networks, having to do with relating pieces of information that are widely separated in time or space. “This problem has been a very fundamental issue in AI due to the necessity to do reasoning over long time-delays in sequence-prediction tasks,” he says. “Although I do not think this paper completely solves this problem, it shows promising results on the long-term dependency tasks such as question-answering, text summarization, and associative recall.”

Gülçehre adds, “Since the experiments conducted and model proposed in this paper are released as open-source on Github, as a result many researchers will be interested in trying it on their own tasks. … To be more specific, potentially the approach proposed in this paper can have very high impact on the fields of natural language processing and reinforcement learning, where the long-term dependencies are very crucial.”

The research received support from the Army Research Office, the National Science Foundation, the MIT-SenseTime Alliance on Artificial Intelligence, and the Semiconductor Research Corporation. The team also had help from the Science Daily website, whose articles were used in training some of the AI models in this research.

Jump-starting the economy with science

In 1988, the U.S. federal government created a $3 billion, 15-year project to sequence the human genome. Not only did the project advance science, it hit the economic jackpot: In 2012, human genome sequencing accounted for an estimated 280,000 jobs, $19 billion in personal income, $3.9 billion in federal taxes, and $2.1 billion in state and local taxes. And all for a price of $2 per year per U.S. resident.

“It’s an incredible rate of return,” says MIT economist Simon Johnson.

It’s not just genomics that pays off. Every additional $10 million in public funding granted to the National Institutes of Health, according to one MIT study, on average produces 2.7 patents and an additional $30 million in value for the private-sector firms that own those patents. When it comes to military technology, each dollar in publicly funded R&D leads to another $2.50-$5.90 in private-sector investment.

In general, “Public investment in science has very big economic returns,” says Johnson, who is the Ronald A. Kurtz Professor of Entrepreneurship at the MIT Sloan School of Management.

Yet after a surge in science funding spurred by World War II, the U.S. has lowered its relative level of public investment in research and development — from about 2 percent of GDP in 1964 to under half of that today.

Reviving U.S. support of science and technology is one of the best ways we can generate economic growth, according to Johnson and his MIT economist colleague Jonathan Gruber, who is the Ford Professor of Economics in MIT’s Department of Economics. And now Johnson and Gruber make that case in a new book, “Jump-Starting America: How Breakthrough Science Can Revive Economic Growth and the American Dream,” published this month by PublicAffairs press.

In it, the two scholars contend that pumping up public investment in science would create not only overall growth but also better jobs throughout the economy, in an era when stagnating incomes have caused strain for a large swath of Americans.

“Good jobs are for MIT graduates, but they’re also for people who don’t finish college. They’re for people who drop out of high school,” says Johnson. “There’s a tremendous amount of anxiety across the country.”

Hello, Columbus

Indeed, spurring growth across the country is a key theme of “Jump-Starting America.” Technology-based growth in the U.S. has been focused in a few “superstar” cities, where high-end tech jobs have been accompanied by increased congestion and sky-high housing prices, forcing out the less well-off.

“The prosperity has been concentrated in some places where it’s become incredibly expensive to live and work,” Johnson says. That includes Silicon Valley, San Francisco, New York, Los Angeles, Seattle, the Washington area, and the Boston metro area.

And yet, Johnson and Gruber believe, the U.S. has scores of cities where the presence of universities combined with industrial know-how could produce more technology-based growth. Some already have: As the authors discuss in the book, Orlando, Florida, is a center for high-tech computer modeling and simulation, thanks to the convergence of federal investment, the growth of the University of Central Florida, and local backing of an adjacent research park that supports dozens of thriving enterprises.

The Orlando case is “a modern version of what once made America the most prosperous nation on Earth,” the authors write, and they believe it can be replicated widely.

“Let’s spread it around the country, to take advantage of where the talent is in the U.S., because there’s a lot of talent away from the coastal cities,” Johnson says.

“Jump-Starting America” even contains a list of 102 metropolitan areas the authors think are ripe for investment and growth, thanks to well-educated work forces and affordability, among other factors. At the top of the list are Pittsburgh, Rochester, and three cities in Ohio: Cincinnati, Columbus, and Cleveland.

The authors’ list does not include any California cities — where affordability is generally a problem — but they view the ranking as a conversation-starter, not the last word on the subject. The book’s website has an interactive feature where readers can tweak the criteria used to rank cities, and see the results.

“We’d like people to challenge us and say, maybe we should think of the criteria differently,” Johnson says. “Everyone should be thinking about what have we got in our region, what do we need to get, and what kind of investment would make the difference here.”

A dividend on your investment

“Jump-Starting America” has received praise from scholars and policy experts. Susan Athey, an economist at Stanford University, calls the book “brilliant” and says it “brings together economic history, urban economics, and the design of incentives to build an ambitious proposal” for growth. Jean Tirole, of the Toulouse School of Economics, says the book gives a boost to industrial policy, by showing “how the government can promote innovation while avoiding the classic pitfalls” of such interventions.

For their part, Johnson and Gruber readily acknowledge that public investment in R&D is just one component of long-term growth. Continued private-sector investment, they note, is vital as well. Still, the book does devote a chapter to the limits of private investment, including the short-term focus on returns that has led many firms to scale back their own R&D operations.

“We’re very pro-private sector,” Johnson says. “I’m a professor of entrepreneurship at Sloan, and I work a lot with entrepreneurs around the world and venture capitalists. They will tell you, quite frankly … their incentives are to make money relatively quickly, given their time horizons and what their investors want. As a result they are drawn to a few sectors, including information technology, and within that more software than hardware these days.”

As a sweetener for any program of public science investment, the authors also suggest that people should receive a kind of annual innovation dividend — a return on their tax dollars. In effect, this would be a scaled-up version of the dividend that, for instance, Alaskans receive on that state’s energy revenues.

That would be a departure from current U.S. policy, but ultimately, Johnson and Gruber say, a departure is what we need.

“We don’t find the existing policies from anyone compelling,” Johnson says. “So we wanted to put some ideas out there and really start a debate about those alternatives, including a return to a bigger investment in science and technology.”

Pros and Cons Of 3D and 4D Ultrasounds

2D ultrasound technology has been used for decades in pregnancy and other applications. It is used to assess the overall health and well-being of the unborn baby, can help keep the mother out of the danger zone if the baby has not formed properly, and can reveal problems with the unborn child that inform the decision of whether it is safe to carry on with the pregnancy or to terminate it. We have now moved beyond the world of 2D, and what we have today are 3D and 4D technologies. There is no doubt that 3D and 4D technologies have their own benefits as far as image quality and resolution are concerned. 4D imaging, in particular, is very advanced: the images are updated continuously, so you will be able to see all the minute movements of the unborn baby. It is certainly a fascinating and unforgettable experience for the mother and father, and it could also help in identifying some problems with the unborn baby. Cleft palate, for example, is a problem with many babies, and it can be identified with the help of 3D and 4D imaging and ultrasound services in Fort Walton Beach. Let us now look at the pros and cons of 4D scans, which are becoming all the rage in various parts of the country.

Benefits

There are some obvious benefits associated with 4D ultrasound imaging services. Because of the clear images available, it is one of the best and surest ways of assessing the prenatal condition of the fetus. It also gives a real-time look at the face of the fetus, the movement of limbs, breathing, and other such activities. Further, prenatal neurodevelopment can be judged much better in a 3D or 4D scan than in a 2D scan. The test is performed during different periods of gestation; it shows the behavior of the fetus and the development of the brain, making it possible to find out whether the central nervous system is normal or whether there is some abnormality.

Risks

There is no doubt that anything you do during pregnancy could pose a risk. However, 3D and 4D imaging expose the fetus to the same risk as conventional 2D ultrasound imaging, since they use the same wavelengths to conduct the tests. Doctors nevertheless try to limit the test to no more than 30 minutes.

The Final Word

There is no doubt that there is regular demand for 2D, 3D, and 4D scanning facilities because of the obvious advantages and benefits associated with them. Scanning is now becoming a regular and integral part of the doctor’s evaluation of the size of the baby and its overall health. Hence, if it is used carefully and without too much overexposure, there are reasons to believe it offers more benefits than drawbacks. That said, 3D and 4D imaging services are more expensive than 2D scan facilities.

Contact US:

Living Images

Address:
6 11th Ave Suite F2
Shalimar, FL
Phone: (850) 244-2883

Dress Codes For Women Golfers

There is no doubt that, like all sports, golf has some well-defined dress codes. Many are of the opinion that these codes can be a bit tough and strict, especially for women. So where exactly does the truth lie? It would be interesting to find out. We will have a look at women’s golf apparel so that those who are keen on it can get the required information. It would also be pertinent to mention here that golf dresses have grown and changed over the years. Gone are the days when the dresses were heavy, unwieldy, and covered almost the entire body.

Today, without being overtly provocative, golf dress codes for women focus on comfort and style while staying in line with today’s fashion trends. While the dress code remains broadly uniform, it varies somewhat from one golf organization to another. Some are liberal about dress habits for women golfers, while others continue to be rigid and conservative. However, there are some general rules pertaining to dress for women golfers, and it would be interesting to look at a few of them for the sake of our customers and other information seekers.

 Tops

On most golf courses, women are required to wear blouses, which may be either sleeved or sleeveless. There is no major restriction on the colors of tops, though most women prefer polo-style shirts, which are extremely comfortable and convenient when taking shots and moving around the course. Polo shirts are available in different colors and designs, including v-neck, button-down, and zip-top styles, in both short and long sleeves. Apart from plain colors, women golfers also wear floral stripes and other patterns. There are some exceptions, however: halters, t-shirts, and tank tops are generally not allowed.

Sweaters And Jackets

Dressing in layers is now quite common among golfers. Many women golfers wear a sweater or vest over a turtleneck or polo shirt, which comes in very handy on a cold day. It is also common to see them wearing collared button-down shirts with a light golf jacket or wind shirt for additional coverage. However, denim and sweat jackets are not acceptable.

Bottoms

During early fall or spring, slacks are the most common choice for women on the golf course. When the weather is warm, women opt for capris, shorter slacks, shorts, and crops. However, shorter pants should be at least knee-length, failing which they may not be accepted.

Hence, there is no doubt that there are quite a few options and choices to make as far as women golf dresses are concerned.

Contact US:

FlirTee Golf

Address:
3601 NW 175th St
Edmond, OK
Phone: (405) 568-8944

MIT spinout seeks to transform food safety testing

“This is a $10 billion market and everyone knows it.” Those are the words of Chris Hartshorn, CEO of a new MIT spinout — Xibus Systems — that is aiming to make a splash in the food industry with their new food safety sensor.

Hartshorn has considerable experience supporting innovation in agriculture and food technology. Prior to joining Xibus, he served as chief technology officer for Callaghan Innovation, a New Zealand government agency. A large portion of the country’s economy relies upon agriculture and food, so a significant portion of the innovation activity there is focused on those sectors.

While there, Hartshorn came in contact with a number of different food safety sensing technologies that were already on the market, aiming to meet the needs of New Zealand producers and others around the globe. Yet, “every time there was a pathogen-based food recall” he says, “it shone a light on the fact that this problem has not yet been solved.” 

He saw innovators across the world trying to develop a better food pathogen sensor, but when Xibus Systems approached Hartshorn with an invitation to join as CEO, he saw something unique in their approach, and decided to accept.

Novel liquid particles provide quick indication of food contamination

Xibus Systems was formed in the fall of 2018 to bring a fast, easy, and affordable food safety sensing technology to food industry users and everyday consumers. The development of the technology, based on MIT research, was supported by two commercialization grants through the MIT Abdul Latif Jameel Water and Food Systems Lab’s J-WAFS Solutions program. It is based on specialized droplets — called Janus emulsions — that can be used to detect bacterial contamination in food. The use of Janus droplets to detect bacteria was developed by a research team led by Tim Swager, the John D. MacArthur Professor of Chemistry, and Alexander Klibanov, the Novartis Professor of Biological Engineering and Chemistry.

Swager and researchers in his lab originally developed the method for making Janus emulsions in 2015. Their idea was to create a synthetic particle that has the same dynamic qualities as the surface of living cells. 

The liquid droplets consist of two hemispheres of equal size, one made of a blue-tinted fluorocarbon and one made of a red-tinted hydrocarbon. The hemispheres have different densities, which affects how they align and how opaque or transparent they appear when viewed from different angles. They are, in effect, lenses. What makes these micro-lenses unique, however, is their ability to bind to specific bacterial proteins. Their binding properties enable them to move, flipping from red hemisphere to blue based on the presence or absence of a particular bacterium, such as Salmonella.

“We were thrilled by the design,” Swager says. “It is a completely new sensing method that could really transform the food safety sensing market. It showed faster results than anything currently available on the market, and could still be produced at very low cost.”

Janus emulsions respond exceptionally quickly to contaminants and provide quantifiable results that are visible to the naked eye or can be read via a smartphone sensor. 
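As a rough illustration of what a quantifiable readout might look like (a hypothetical toy model; the function, threshold, and numbers are invented and are not the actual Xibus pipeline), one can imagine counting the fraction of droplets that have flipped and comparing it to a detection threshold:

```python
def contamination_signal(droplet_flipped, threshold=0.05):
    """Toy readout (hypothetical, not the actual Xibus pipeline):
    droplet_flipped is a list of booleans, True when a droplet has
    flipped after binding bacterial protein. The flipped fraction
    serves as the signal; above the threshold, flag the sample."""
    fraction = sum(droplet_flipped) / len(droplet_flipped)
    return fraction, fraction > threshold

# 8 of 100 simulated droplets have flipped
readings = [True] * 8 + [False] * 92
fraction, flagged = contamination_signal(readings)
print(fraction, flagged)  # -> 0.08 True
```

A real smartphone readout would estimate that fraction from image analysis rather than from a prepared list, but the principle — a visible, countable color change — is the same.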

“The technology is rooted in very interesting science,” Hartshorn says. “What we are doing is marrying this scientific discovery to an engineered product that meets a genuine need and that consumers will actually adopt.”

Having already secured nearly $1 million in seed funding from a variety of sources, and also being accepted into Sprout, a highly respected agri-food accelerator, they are off to a fast start.

Solving a billion-dollar industry challenge

Why does speed matter? In the field of food safety testing, the standard practice is to culture food samples to see if harmful bacterial colonies form. This process can take many days, and often can only be performed offsite in a specialized lab.

While more rapid techniques exist, they are expensive and require specialized instruments — which are not widely available — and still typically take 24 hours or more from start to finish. When there is a long delay between food sampling and contaminant detection, food products may already have reached consumers’ hands — and upset their stomachs. While the illness and death that food-borne pathogens can cause are alarming enough, there are other costs as well. Food recalls result in tremendous waste, not only of the food products themselves but of the labor and resources involved in their growth, transportation, and processing. Recalls also mean lost profit for the company: North America alone loses $5 billion annually to recalls, and that doesn’t count the indirect costs of damage to particular brands, including market share losses that can last for years.

The food industry would benefit from a sensor that could provide fast and accurate readings of the presence and amount of bacterial contamination on-site. The Swager Group’s Janus emulsion technology has many of the elements required to meet this need and Xibus Systems is working to improve the speed, accuracy, and overall product design to ready the sensor for market.

Two other J-WAFS-funded researchers have helped improve the efficiency of early product designs. Mathias Kolle, assistant professor in the Department of Mechanical Engineering at MIT and recipient of a separate 2017 J-WAFS seed grant, is an expert on optical materials. In 2018, he and his graduate student Sara Nagelberg performed the calculations describing light’s interaction with the Janus particles so that Swager’s team could modify the design and improve performance. Kolle continues to be involved, serving with Swager on the technical advisory team for Xibus. 

This effort was a new direction for the Swager group. Says Swager: “The technology we originally developed was completely unprecedented. At the time that we applied for a J-WAFS Solutions grant, we were working in new territory and had minimal preliminary results. At that time, we would not have made it through, for example, government funding reviews, which can be conservative. J-WAFS sponsorship of our project at this early stage was critical to help us achieve the technology innovations that serve as the foundation of this new startup.”

Xibus co-founder Kent Harvey — also a member of the original MIT research team — is joined by Matthias Oberli and Yuri Malinkevich. Together with Hartshorn, they are working on a prototype for initial market entry. They are developing two different products: a smartphone sensor accessible to everyday consumers, and a portable handheld device that is more sensitive and suitable for industry. If they can build a platform that meets industry needs for affordability, accuracy, ease of use, and speed, they could apply it to any situation where a user needs to analyze organisms that live in water. This opens up many sectors in the life sciences, including water quality, soil sensing, and veterinary diagnostics, as well as fluid diagnostics for the broader healthcare sector.

The Xibus team wants to nail their product right off the bat.

“Since food safety sensing is a crowded field, you only get one shot to impress your potential customers,” Hartshorn says. “If your first product is flawed or not interesting enough, it can be very hard to open the door with these customers again. So we need to be sure our prototype is a game-changer. That’s what’s keeping us awake at night.”

The evolving definition of a gene

More than 50 years ago, scientists came up with a definition for the gene: a sequence of DNA that is copied into RNA, which is used as a blueprint for assembling a protein.

In recent years, however, with the discovery of ever more DNA sequences that play key roles in gene expression without being translated into proteins, this simple definition needed revision, according to Gerald Fink, the Margaret and Herman Sokol Professor in Biomedical Research and American Cancer Society Professor of Genetics in MIT’s Department of Biology.

Fink, a pioneer in the field of genetics, discussed the evolution of this definition during yesterday’s James R. Killian Jr. Faculty Achievement Award Lecture, titled, “What is a Gene?”

“In genetics, we’ve lost a simple definition of the gene — a definition that lasted over 50 years,” he said. “But loss of the definition has spawned whole new fields trying to understand the unknown information in non-protein-coding DNA.”

Established in 1971 to honor MIT’s 10th president, James Killian, the Killian Award recognizes extraordinary professional achievements by an MIT faculty member. Fink, who is also a member and former director of the Whitehead Institute, was honored for his achievements in developing brewer’s yeast as “the premier model for understanding the biology of eukaryotes” — organisms whose cells have nuclei.

“He is among the very few scientists who can be singularly credited with fundamentally changing the way we approach biological problems,” says the award citation, read by Susan Silbey, chair of the MIT faculty, who presented Fink with the award.

Genetic revolution

Growing up in a “sleepy” town on Long Island, Fink had a keen interest in science, which spiked after the Soviets launched the first satellite to orbit the Earth.

“In 1957, when I went out in our backyard, I was hypnotized by the new star in the sky, as Sputnik slowly raced toward the horizon,” he said. “Overnight, science became a national priority, energized by the dread of Soviet scientific and technological superiority.”

After earning his bachelor’s degree at Amherst College, Fink began studying yeast as a graduate student at Yale University, and in 1976, he developed a way to insert any DNA sequence into yeast cells.

This discovery transformed biomedical research by allowing scientists to program yeast to produce any protein they wanted, as long as they knew the DNA sequence of the gene that encoded it. It also proved industrially useful: More than half of all therapeutic insulin is now produced by yeast, along with many other drugs and vaccines, as well as biofuels such as ethanol.

At that time, scientists were operating with a straightforward definition of the gene, based on the “central dogma” of biology: DNA makes RNA, and RNA makes proteins. Therefore, a gene was defined as a sequence of DNA that could code for a protein. This was convenient because it allowed computers to be programmed to search the genome for genes by looking for specific DNA sequences bracketed by codons that indicate the starting and stopping points of a gene.
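The codon-bracket search described above can be sketched in a few lines of code. The following is a minimal illustrative simplification, not a production gene finder: it scans one DNA strand, in each of the three reading frames, for stretches that begin with the start codon ATG and run to the first in-frame stop codon (TAA, TAG, or TGA). The function name and parameters are illustrative, and real gene prediction also handles the reverse strand, introns, and statistical models.

```python
# Sketch of scanning DNA for open reading frames (ORFs):
# sequences bracketed by a start codon and an in-frame stop codon.

START = "ATG"
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=2):
    """Return (start, end) index pairs of simple ORFs on one strand."""
    seq = seq.upper()
    orfs = []
    for frame in range(3):          # check all three reading frames
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i+3] == START:
                # extend codon by codon until an in-frame stop codon
                j = i + 3
                while j + 3 <= len(seq) and seq[j:j+3] not in STOPS:
                    j += 3
                if j + 3 <= len(seq) and (j + 3 - i) // 3 >= min_codons:
                    orfs.append((i, j + 3))   # span includes the stop codon
                    i = j + 3                 # resume after this ORF
                    continue
            i += 3
    return orfs

print(find_orfs("CCATGAAATGA"))  # → [(2, 11)], i.e. ATG AAA TGA
```

This same bracket-and-scan idea, scaled up with far more sophisticated models, is what made it convenient to program computers to hunt for protein-coding genes across whole genomes.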

In recent decades, scientists have done just that, identifying about 20,000 protein-coding genes in the human genome. They have also discovered genetic mechanisms involved in thousands of human diseases. Using new tools such as CRISPR, which enables genome editing, cures for such diseases may soon be available, Fink believes.

“The definition of a gene as a DNA sequence that codes for a protein, coupled with the sequencing of the human genome, has revolutionized molecular medicine,” he said. “Genome sequencing, along with computational power to compare and analyze genomes, has led to important insights into basic science and disease.”

However, he pointed out, protein-coding genes account for just 2 percent of the entire human genome. What about the rest of it? Scientists have traditionally referred to the remaining 98 percent as “junk DNA” that has no useful function.

In the 1980s, Fink began to suspect that this junk DNA was not as useless as had been believed. He and others discovered that in yeast, certain segments of DNA could “jump” from one location to another, and that these segments appeared to regulate the expression of whatever genes were nearby. This phenomenon was later observed in human cells as well.

“That alerted me and others to the fact that ‘junk DNA’ might be making RNA but not proteins,” Fink said.

Since then, scientists have discovered many types of non-protein-coding RNA molecules, including microRNAs, which can block the production of proteins, and long non-coding RNAs (lncRNAs), which have many roles in gene regulation.

“In the last 15 years, it has been found that these are critical for controlling the gene expression of protein-coding genes,” Fink said. “We’re only now beginning to visualize the importance of this formerly invisible part of the genome.”

Such discoveries demonstrate that the traditional definition of a gene is inadequate to encompass all of the information stored in the genome, he said.

“The existence of these diverse classes of RNA is evidence that there is no single physical and functional unit of heredity that we can call the gene,” he said. “Rather, the genome contains many different categories of informational units, each of which may be considered a gene.”

“A community of scholars”

In selecting Fink for the Killian Award, the award committee also cited his contributions to the founding of the Whitehead Institute, which opened in 1982. At the time, forming a research institute that was part of MIT yet also its own entity was considered a “radical experiment,” Fink recalled.

Though controversial at the time, with heated debate among the faculty, establishing the Whitehead Institute laid the groundwork for many other research institutes that have been established at MIT, and also helped to attract biotechnology companies to the Kendall Square area, Fink said.

“As we now know, MIT made the right decision. The Whitehead turned out to be a successful pioneer experiment that in my opinion led to the blossoming of the Kendall Square area,” he said.

Fink was hired as one of the first faculty members of the Whitehead Institute, and served as its director from 1990 to 2001, when he oversaw the Whitehead’s contributions to the Human Genome Project. He recalled that throughout his career, he has collaborated extensively not only with other biologists, but with MIT colleagues in fields such as physics, chemical engineering, and electrical engineering and computer science.

“MIT is a community of scholars, and I was welcomed into the community,” he said.

Letter regarding the MIT Schwarzman College of Computing community forums in April

The following letter was sent to the MIT community on April 4 by Provost Martin A. Schmidt.

To the members of the MIT community:

I write to invite you to a series of three community forums with the MIT Stephen A. Schwarzman College of Computing Working Groups on April 17 and 18.

As you may recall, in February five working groups were charged with developing ideas and options for the MIT administration to consider in planning the structure and operation of the new College, with the intent of completing this work in May. The forums will provide updates on the working groups’ discussions to date, and give you an opportunity to share questions, thoughts, and ideas with the working group co-chairs leading the discussions.

Because some of the content the working groups are exploring overlaps, we have structured the forums as follows:

  1. Joint forum with the Social Implications and Responsibilities of Computing and Academic Degrees working groups
    Wednesday, April 17, 10:30 am–12:00 pm
    Kresge Little Theatre (W16-035)
     
  2. Joint forum with the Organizational Structure and Faculty Appointments working groups
    Wednesday, April 17, 1:00–2:30 pm
    Kresge Little Theatre (W16-035)
     
  3. Computing Infrastructure working group forum
    Thursday, April 18, 10:30 am–11:30 am
    Samberg Dining Rooms 3 and 4 (E52, 6th floor)

Please visit the MIT Schwarzman College of Computing Task Force website for more information on the working groups and to contribute your suggestions to the Idea Bank.

Your participation and input are important. The working groups and I hope that you will attend any or all of these forums.

I also want to note that, beyond these forums, we will be planning additional opportunities for community engagement going forward as we plan for the new College.

Sincerely,

Martin A. Schmidt

KSA meeting explores collaboration in 2019

Speakers at the Kendall Square Association (KSA) annual meeting yesterday considered the ways collaboration in our society is changing and gave updates on their work to strengthen the bonds between members of the local community.

Many organizations champion the concept of collaboration, but this year the KSA wanted to take a closer look at the influence modern technology has had on collaboration between people, organizations, and societies.

“We all talk about collaboration — it’s kind of obvious — but there are new ways to think about collaboration that can help us work more deeply with one another and get things done more quickly,” says Sarah Gallop, who serves as chair of the board at the KSA and co-director of MIT’s Office of Government and Community Relations. “We’re not talking about superficial collaboration here. We’re talking about intense collaboration, with depth, that allows us to do more by working together.”

The KSA is a nonprofit organization made up of industry, government, and academic officials, which seeks to build partnerships, host events, advocate for public policy issues, and tell the story of Kendall Square’s transformation. More than 200 people from a wide range of companies and organizations attended this year’s meeting, which was hosted by event sponsor Boston Marriott Cambridge.

The meeting also included talks on the KSA’s Diversity, Equity, and Inclusion Initiative; efforts to improve commuting options; and recent accomplishments in the bustling Kendall Square area that many at the meeting referred to as the most innovative square mile in the world.

Collaboration in 2019

The meeting’s keynote speaker was Jeremy Heimans, co-author of the recently released book “New Power: How Power Works in Our Hyperconnected World — and How to Make It Work for You”. In his talk, Heimans described the new power structures enabled by technology, citing examples such as the #MeToo movement’s effectiveness at holding previously untouchable company executives accountable for sexual misconduct, and the fact that Airbnb is currently more valuable than Hilton.

He described new power as current, open, and made by many people, and he challenged attendees to start thinking about when and where to turn to these new power dynamics in their organizations.

Gallop says the talk was a good way to help KSA members collectively develop new ways of thinking.

“It’s so important for us to learn together,” Gallop says. “Each year, we bring in a speaker that’s going to help us broaden our minds, broaden our thinking, and make us think about current issues. Jeremy is teaching us how to make things happen and how to accelerate our work even more.”

During a Q&A after his talk, Heimans told members of the audience that they have an important role to play in this new power landscape, defending science and reason in a society that’s less hierarchical than ever before and where knowledgeable voices can more easily get drowned out.

“Expertise is under assault right now,” Heimans said. “We need you. You’re already densely connected, so experiment with ways to further unleash that. What can you do to increase collaboration, tackle these issues, and spread what you do so well — innovation — around the world?”

A busy 10 years

The KSA has long been focused on the issues of connectivity, transportation, and diversity and inclusion. Gallop says those goals make MIT’s close partnership with the organization easy.

“I have always felt that the KSA and MIT have very similar values in terms of our missions around learning, being inclusive, and sharing information,” Gallop said. “For me, as an MIT employee and chair of the KSA board, there’s a lot of synergy between the two organizations that makes it a natural fit.”

Last year, the KSA assembled a Diversity and Inclusion Learning Community to get leaders brainstorming ways to promote diversity in the area. Since then, they’ve hired an expert facilitator to guide the group, and they are currently assembling their second cohort of leaders in the area.

The KSA also announced a series of design-thinking workshops and meetings to be held throughout April as part of its placemaking initiative. The KSA defines placemaking as “a collaborative process to better define, activate, and program public space.” To help with the initiative, the KSA assembled a working group of professionals with placemaking expertise that includes MIT faculty members, urban planners, real estate developers, and community and business leaders.

“We want to optimize conditions for connectivity and collaboration,” said Jesse Baerkahn, founder of urban design firm Graffito SP and a member of the KSA’s board of directors. “We’ll do that by injecting play, fun, and beauty into the area that reflects the magic and uniqueness that makes Kendall Square what it is today.”

Improving commuting times and safety for Kendall Square’s workers is another area where the KSA has been active over the last year. Members of the KSA meet regularly with the MBTA and the Massachusetts Department of Transportation, and have been working closely with other advocacy groups like T4MA and A Better City over the past year.

In October, the KSA launched the Transportation ADVANCE Initiative to engage the Kendall Square community through a series of experiments to find hyper-local solutions to improving workers’ commutes.

Of course, the KSA’s 10th anniversary also offered an opportunity to reflect on how much Kendall Square and the KSA have evolved over the years. The area’s past as an industrial wasteland has been well-documented, but attendees also discussed the more recent transformation of the last 10 years.

“Everywhere I go, people talk about how Kendall Square is changing, with new buildings, restaurants, gathering spaces, people, inventions, and companies,” Gallop said in her opening remarks. “Sometimes, I have to take a minute when I’m walking around because a building I swear wasn’t there last week is all of a sudden completed in the square. But Kendall has always been about change.”

Like the area itself, Gallop says, the KSA’s priorities are always evolving.

“Our priorities may change; the beauty of Kendall Square is that it’s always changing,” Gallop says. “If you come back in a few years there might be a new priority area — or three new ones.”
