Engineering | MIT News

August 17, 2018

  • The American Physical Society (APS) has recognized MIT Plasma Science and Fusion Center (PSFC) principal research scientists John Wright and Stephen Wukitch, as well as Yevgen Kazakov and Jozef Ongena of the Laboratory for Plasma Physics in Brussels, Belgium, with the Landau-Spitzer Award for their collaborative work.

    Given biennially to acknowledge outstanding plasma physics collaborations between scientists in the U.S. and the European Union, the prize this year is being awarded “for experimental verification, through collaborative experiments, of a novel and highly efficient ion cyclotron resonance heating scenario for plasma heating and generation of energetic ions in magnetic fusion devices.”

    The collaboration originated at a presentation on a proposed heating scenario by Kazakov, given at a conference in 2015. Wright and Wukitch were confident that MIT's Alcator C-Mod (the world’s highest-field tokamak) and the UK's JET (the world’s largest tokamak) would allow for an expedited and comprehensive experimental investigation. C-Mod’s high magnetic fields made it ideal for confining energetic ions, and its unique diagnostics allowed the physics to be verified within months of the conference. The results greatly strengthened Kazakov and Ongena's proposal for a JET experiment that conclusively demonstrated generation of energetic ions via this heating technique.

    Additional C-Mod experiments were the first to observe alpha-like energetic ions at high magnetic field and reactor-like densities. The joint experimental work highlighting JET and C-Mod results was published in Nature Physics.

    One of the key fusion challenges is confining the very energetic fusion product ions, which must transfer their energy to the core plasma before they escape confinement. This heating scenario efficiently generates ions with energies comparable to those of fusion products and can be used to study energetic ion behavior in present-day devices such as JET and the stellarator Wendelstein 7-X (W-7X). It will also allow study in the non-nuclear phase of ITER, the next-generation fusion device being built in France.

    “It will be the icing on the cake to use this scenario at W-7X,” says Wright. “Because stellarators have large volume and high-density plasmas, it is hard for current heating scenarios to achieve those fusion energies. With conventional techniques it has been difficult to show if stellarators can confine fast ions. Using this novel scenario will definitely allow researchers to demonstrate whether a stellarator will work for fusion plasmas.”

    The award, given jointly by APS and the European Physical Society, will be presented to the team in November at the APS Division of Plasma Physics meeting in Portland, Oregon.

  • A novel encryption method devised by MIT researchers secures data used in online neural networks, without dramatically slowing their runtimes. This approach holds promise for using cloud-based neural networks for medical-image analysis and other applications that use sensitive data.

    Outsourcing machine learning is a rising trend in industry. Major tech firms have launched cloud platforms that conduct computation-heavy tasks, such as, say, running data through a convolutional neural network (CNN) for image classification. Resource-strapped small businesses and other users can upload data to those services for a fee and get back results in several hours.

    But what happens if that private data leaks? In recent years, researchers have explored various secure-computation techniques to protect such sensitive data. But those methods have performance drawbacks that make neural network evaluation (testing and validating) sluggish — sometimes as much as a million times slower — limiting their wider adoption.

    In a paper presented at this week’s USENIX Security Conference, MIT researchers describe a system that blends two conventional techniques — homomorphic encryption and garbled circuits — in a way that helps the networks run orders of magnitude faster than they do with conventional approaches.

    The researchers tested the system, called GAZELLE, on two-party image-classification tasks. A user sends encrypted image data to an online server evaluating a CNN running on GAZELLE. Both parties then share encrypted information back and forth in order to classify the user’s image. Throughout the process, the system ensures that the server never learns any uploaded data and the user never learns anything about the network parameters. In the researchers’ tests, GAZELLE ran 20 to 30 times faster than state-of-the-art secure-computation systems, while reducing the required network bandwidth by an order of magnitude.

    One promising application for the system is training CNNs to diagnose diseases. Hospitals could, for instance, train a CNN to learn characteristics of certain medical conditions from magnetic resonance images (MRI) and identify those characteristics in uploaded MRIs. The hospital could make the model available in the cloud for other hospitals. But the model is trained on, and further relies on, private patient data. Because there has been no efficient way to evaluate such models on encrypted data, this application isn’t quite ready for prime time.

    “In this work, we show how to efficiently do this kind of secure two-party communication by combining these two techniques in a clever way,” says first author Chiraag Juvekar, a PhD student in the Department of Electrical Engineering and Computer Science (EECS). “The next step is to take real medical data and show that, even when we scale it for applications real users care about, it still provides acceptable performance.”

    Co-authors on the paper are Vinod Vaikuntanathan, an associate professor in EECS and a member of the Computer Science and Artificial Intelligence Laboratory, and Anantha Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science.

    Maximizing performance

    CNNs process image data through multiple linear and nonlinear layers of computation. Linear layers do the computation-heavy math (linear algebra), assigning weighted values to the data. The results then pass to nonlinear layers, which perform simpler computations, make decisions (such as identifying image features), and send the data to the next linear layer. The end result is an image with an assigned class, such as vehicle, animal, person, or anatomical feature.
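    As a toy illustration (not the paper's actual network), the alternation of linear and nonlinear layers can be sketched in a few lines of NumPy, with a matrix multiply standing in for each linear layer and a ReLU standing in for each nonlinear one; all sizes and weights below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_layer(x, W, b):
    # Linear layer: a matrix multiply (linear algebra) that re-weights the data.
    return W @ x + b

def relu(x):
    # Nonlinear layer: an elementwise decision that keeps positive values
    # and zeroes out the rest.
    return np.maximum(x, 0)

# A toy network: linear -> nonlinear -> linear, then pick the top class.
x = rng.normal(size=8)                          # flattened "image" features
W1, b1 = rng.normal(size=(16, 8)), np.zeros(16)
W2, b2 = rng.normal(size=(4, 16)), np.zeros(4)

scores = linear_layer(relu(linear_layer(x, W1, b1)), W2, b2)
predicted_class = int(np.argmax(scores))        # e.g. 0=vehicle, 1=animal, ...
```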

    Recent approaches to securing CNNs have involved applying homomorphic encryption or garbled circuits to process data throughout an entire network. These techniques are effective at securing data. “On paper, this looks like it solves the problem,” Juvekar says. But they render complex neural networks inefficient, “so you wouldn’t use them for any real-world application.”

    Homomorphic encryption, used in cloud computing, performs computation directly on encrypted data, called ciphertext, and generates an encrypted result that can then be decrypted by the user. When applied to neural networks, this technique is fast and efficient at computing linear algebra. However, it must introduce a little noise into the data at each layer. Over multiple layers, that noise accumulates, and the computation needed to manage it grows increasingly complex, slowing computation speeds.
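    The noise issue can be seen in a toy additively homomorphic scheme (an illustration only; GAZELLE uses a far more sophisticated lattice-based scheme, and every name and parameter below is invented). Each ciphertext carries a small error term, homomorphic addition adds those errors together, and decryption only succeeds while the accumulated error stays under a fixed budget:

```python
import random

Q = 1 << 16          # ciphertext modulus
DELTA = 256          # message scaling factor; the noise budget is DELTA // 2
random.seed(1)
SECRET = random.randrange(Q)

def encrypt(m, noise=4):
    # Fresh ciphertext: the message is scaled up and hidden under the secret,
    # plus a small random error term.
    a = random.randrange(Q)
    e = random.randint(-noise, noise)
    return (a, (a * SECRET + m * DELTA + e) % Q)

def add(c1, c2):
    # Homomorphic addition: the messages add, but so do the error terms.
    return ((c1[0] + c2[0]) % Q, (c1[1] + c2[1]) % Q)

def decrypt(c):
    # Rounding removes the error -- until accumulated noise exceeds DELTA // 2.
    a, b = c
    v = (b - a * SECRET) % Q
    return round(v / DELTA) % (Q // DELTA)
```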

    Garbled circuits are a form of secure two-party computation. The technique takes inputs from both parties, does some joint computation, and returns a separate output to each party. In that way, the parties exchange data without ever seeing each other's inputs; each learns only the output relevant to its side. The bandwidth needed to communicate data between the parties, however, scales with the complexity of the computation, not with the size of the input. In an online neural network, this technique works well in the nonlinear layers, where computation is minimal, but the bandwidth becomes unwieldy in math-heavy linear layers.
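    The flavor of the technique can be seen in a toy garbled AND gate (a sketch of the core idea only, not a production protocol; standard optimizations such as point-and-permute are omitted). The garbler assigns each wire two random labels, one per bit value, and publishes a table in which each output label is encrypted under the corresponding pair of input labels. An evaluator holding one label per input wire can open exactly one row, learning the output label but nothing about the other party's bit:

```python
import hashlib
import random
import secrets

LABEL_LEN = 16      # bytes per wire label
TAG = b"\x00" * 8   # zero tag marks the one row that decrypts correctly

def H(*parts):
    return hashlib.sha256(b"".join(parts)).digest()

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def garble_and_gate():
    # Two random labels per wire: index 0 encodes bit 0, index 1 encodes bit 1.
    wires = {w: (secrets.token_bytes(LABEL_LEN), secrets.token_bytes(LABEL_LEN))
             for w in ("a", "b", "out")}
    table = []
    for bit_a in (0, 1):
        for bit_b in (0, 1):
            out_label = wires["out"][bit_a & bit_b]  # AND truth table
            pad = H(wires["a"][bit_a], wires["b"][bit_b])
            table.append(xor(out_label + TAG, pad))
    random.shuffle(table)  # hide which row corresponds to which inputs
    return wires, table

def evaluate(table, label_a, label_b):
    # With one label per input wire, only one row yields a valid zero tag.
    pad = H(label_a, label_b)
    for ct in table:
        pt = xor(ct, pad)
        if pt.endswith(TAG):
            return pt[:LABEL_LEN]
    raise ValueError("no row decrypted")
```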

    The MIT researchers, instead, combined the two techniques in a way that gets around their inefficiencies.

    In their system, a user uploads ciphertext to a cloud-based CNN while running the garbled-circuits protocol on their own computer. The CNN does all the computation in the linear layer, then sends the data to the nonlinear layer. At that point, the CNN and the user share the data. The user does some computation on garbled circuits and sends the data back to the CNN. By splitting and sharing the workload, the system restricts the homomorphic encryption to doing complex math one layer at a time, so the data doesn’t become too noisy. It also limits the communication of the garbled circuits to just the nonlinear layers, where they perform optimally.

    “We’re only using the techniques for where they’re most efficient,” Juvekar says.

    Secret sharing

    The final step was ensuring both homomorphic and garbled circuit layers maintained a common randomization scheme, called “secret sharing.” In this scheme, data is divided into separate parts that are given to separate parties. All parties synch their parts to reconstruct the full data.

    In GAZELLE, when a user sends encrypted data to the cloud-based service, it’s split between both parties. Added to each share is a secret key (random numbers) that only the owning party knows. Throughout computation, each party will always have some portion of the data, plus random numbers, so it appears fully random. At the end of computation, the two parties synch their data. Only then does the user ask the cloud-based service for its secret key. The user can then subtract the secret key from all the data to get the result.
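    The idea can be sketched with two-party additive secret sharing over a modulus (a toy illustration of the general scheme, not GAZELLE's actual implementation): each value is split into two random-looking shares, either of which alone reveals nothing, and only recombining the shares recovers the value:

```python
import random

MOD = 1 << 32  # work modulo a fixed number so shares look uniformly random

def share(value):
    # Split a value into two shares; each alone is indistinguishable from noise.
    r = random.randrange(MOD)
    return r, (value - r) % MOD

def reconstruct(share_a, share_b):
    # Only by combining both shares does the original value reappear.
    return (share_a + share_b) % MOD
```

    A convenient property of this kind of sharing is that it is additive: if two values are each shared between the parties, each party can add its own shares locally, and reconstructing the results yields the sum, without either party ever seeing the underlying values.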

    “At the end of the computation, we want the first party to get the classification results and the second party to get absolutely nothing,” Juvekar says. Additionally, “the first party learns nothing about the parameters of the model.”

  • Institute Professor Thomas Magnanti has been honored as one of Singapore’s National Day Award recipients, for his long-term work developing higher education in Singapore.

    The government of Singapore announced that Magnanti received the Public Administration Medal (gold) on Aug. 9, the National Day of Singapore, for his role as founding president of the Singapore University of Technology and Design (SUTD). He will receive the medal at a ceremony in Singapore later this year.

    “I am quite pleased,” Magnanti says about the award. “It’s quite an honor to receive it.”

    SUTD is a recently developed university in Singapore focused on innovation-based technology and design across several fields. Its curriculum is organized in interdisciplinary clusters to promote research and education across multiple areas of study.

    The new honor came as a surprise to Magnanti, who started working to help develop SUTD in 2008 and became its president in October 2009. In January 2010, MIT and SUTD signed a memorandum outlining their partnership for both research and education. After a groundbreaking in 2011, SUTD enrolled its first undergraduate students in 2012 and moved to its permanent campus site in 2015.

    MIT and SUTD maintained their education partnership from 2010 to 2017 and continue to work as partners in research through the International Design Center, which has facilities both at MIT and on the SUTD campus.

    Magnanti, who is an MIT Institute Professor, is a professor of operations research at the MIT Sloan School of Management, as well as a faculty member in the Department of Electrical Engineering and Computer Science. He is also a former dean of the School of Engineering. Magnanti is an expert on optimization whose work has spanned business and engineering, as well as the theoretical and applied sides of his field.

    As an MIT faculty member, he first started working with Singaporean leaders in the late 1990s, helping to develop the Singapore-MIT Alliance (SMA), as well as the Singapore-MIT Alliance for Research and Technology (SMART), a research enterprise established in 2007 between MIT and the National Research Foundation of Singapore (NRF).

    Magnanti says his time working on joint educational projects involving MIT and Singapore has been “a wonderful experience.”

    Singapore, Magnanti adds, has consistently maintained “a deep commitment to education and to research, and has a very strong relationship with MIT, which has sustained itself now for over 20 years.”

    Magnanti says he is pleased by the solid footing now established by the projects he has worked on in Singapore. 

    “There have been many highlights,” Magnanti says, including the development of an innovative university and degree structure, and novel pedagogy and research. He notes that students from SUTD “have done very well in their placements, in Singapore. Remarkably well.”

    Overall, Magnanti adds, simply “developing the university has been one of the highlights. Hiring faculty, bringing in outstanding students and staff. … I am, and I think MIT is, very proud of what’s happened with the university.”

August 16, 2018

    A miniature satellite called ASTERIA (Arcsecond Space Telescope Enabling Research in Astrophysics) has measured the transit of a previously discovered super-Earth exoplanet, 55 Cancri e. This finding shows that miniature satellites like ASTERIA are capable of making sensitive detections of exoplanets via the transit method.

    While observing 55 Cancri e, which is known to transit, ASTERIA measured a minuscule change in brightness, about 0.04 percent, when the super-Earth crossed in front of its star. This transit measurement is the first of its kind for CubeSats (the class of satellites to which ASTERIA belongs), which are about the size of a briefcase and hitch a ride to space as secondary payloads on rockets used for larger spacecraft.
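    A dip of that size is what the standard transit-depth formula predicts: the fraction of starlight blocked is the squared ratio of planet radius to star radius. A quick check, using approximate published radii for 55 Cancri e (about 1.9 Earth radii) and its host star (about 0.94 solar radii):

```python
# Transit depth = (planet radius / star radius)^2.
R_EARTH_KM = 6_371.0
R_SUN_KM = 695_700.0

r_planet = 1.9 * R_EARTH_KM        # approximate radius of 55 Cancri e
r_star = 0.94 * R_SUN_KM           # approximate radius of its host star

depth = (r_planet / r_star) ** 2   # fraction of starlight blocked
print(f"transit depth = {depth:.4%}")  # a few hundredths of a percent
```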

    The ASTERIA team presented updates and lessons learned about the mission at the Small Satellite Conference in Logan, Utah, last week.  

    The ASTERIA project is a collaboration between MIT and NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, funded through JPL's Phaeton Program. The project started in 2010 as an undergraduate class project in 16.83/12.43 (Space Systems Engineering), involving a technology demonstration of astrophysical measurements using a CubeSat, with a primary goal of training early-career engineers.

    The ASTERIA mission was designed to demonstrate key technologies, including very stable pointing and thermal control for making extremely precise measurements of stellar brightness in a tiny satellite. Its principal investigator is Sara Seager, the Class of 1941 Professor Chair in MIT’s Department of Earth, Atmospheric and Planetary Sciences with appointments in the departments of Physics and Aeronautics and Astronautics. Earlier this year, ASTERIA achieved pointing stability of 0.5 arcseconds and thermal stability of 0.01 degrees Celsius. These technologies are important for precision photometry, i.e., the measurement of stellar brightness over time.

    Precision photometry, in turn, provides a way to study stellar activity, transiting exoplanets, and other astrophysical phenomena. Several MIT alumni have been involved in ASTERIA's development from the beginning, including Matthew W. Smith PhD '14, Christopher Pong ScD '14, Alessandra Babuscia PhD '12, and Mary Knapp PhD '18. Brice-Olivier Demory, a professor at the University of Bern and a former EAPS postdoc who is also a member of the ASTERIA science team, performed the data reduction that revealed the transit.

    ASTERIA's success demonstrates that CubeSats can perform big science in a small package. This finding has earned ASTERIA the honor of “Mission of the Year,” which was awarded at the SmallSat conference. The honor is presented annually to the mission that has demonstrated a significant improvement in the capability of small satellites, which weigh less than 150 kilograms. Eligible missions have launched, established communication, and acquired results from on-orbit after Jan. 1, 2017.

    Now that ASTERIA has proven that it can measure exoplanet transits, it will continue observing two bright, nearby stars to search for previously unknown transiting exoplanets. Additional funding for ASTERIA operations was provided by the Heising-Simons Foundation.

  • MIT researchers have developed a tool that makes it much easier and more efficient to explore the many compromises that come with designing new products.

    Designing any product — from complex car parts down to workaday objects such as wrenches and lamp stands — is a balancing act with conflicting performance tradeoffs. Making something lightweight, for instance, may compromise its durability.

    To navigate these tradeoffs, engineers use computer-aided design (CAD) programs to iteratively modify design parameters — say, height, length, and radius of a product — and simulate the results for performance objectives to meet specific needs, such as weight, balance, and durability.

    But these programs require users to modify designs and simulate the results for only one performance objective at a time. As products usually must meet multiple, conflicting performance objectives, this process becomes very time-consuming.

    In a paper presented at this week’s SIGGRAPH conference, researchers from the Computer Science and Artificial Intelligence Laboratory (CSAIL) describe a visualization tool for CAD that, for the first time, lets users instead interactively explore all designs that best fit multiple, often-conflicting performance tradeoffs, in real time.

    The tool first calculates optimal designs for three performance objectives in a precomputation step. It then maps all those designs as color-coded patches on a triangular graph. Users can move a cursor in and around the patches to prioritize one performance objective or another. As the cursor moves, 3-D designs appear that are optimized for that exact spot on the graph.

    “Now you can explore the landscape of multiple performance compromises efficiently and interactively, which is something that didn’t exist before,” says Adriana Schulz, a CSAIL postdoc and first author on the paper.

    Co-authors on the paper are Harrison Wang, a graduate student in mechanical engineering; Eitan Grinspun, an associate professor of computer science at Columbia University; Justin Solomon, an assistant professor in electrical engineering and computer science; and Wojciech Matusik, an associate professor in electrical engineering and computer science.

    The new work builds off a tool, InstantCAD, developed last year by Schulz, Matusik, Grinspun, and other researchers. That tool let users interactively modify product designs and get real-time information on performance. The researchers estimated that tool could reduce the time of some steps in designing complex products to seconds or minutes, instead of hours.

    However, a user still had to explore all designs to find one that satisfied all performance tradeoffs, which was time-consuming. This new tool represents “an inverse,” Schulz says: “We’re directly editing the performance space and providing real-time feedback on the designs that give you the best performance. A product may have 100 design parameters … but we really only care about how it behaves in the physical world.”

    In the new paper, the researchers home in on a critical aspect of performance called the “Pareto front,” a set of designs optimized for all given performance objectives, where any design change that improves one objective worsens another objective. This front is usually represented in CAD and other software as a point cloud (dozens or hundreds of dots in a multidimensional graph), where each point is a separate design. For instance, one point may represent a wrench optimized for greater torque and less mass, while a nearby point will represent a design with slightly less torque, but more mass.
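    As a minimal sketch of the concept (the wrench numbers below are invented, and the paper's method does far more than filter a point cloud), a Pareto front can be extracted from a set of candidate designs by keeping every design that no other design dominates, assuming each objective is written so that smaller is better (e.g. mass, and the negative of torque):

```python
def dominates(p, q):
    # p dominates q if p is at least as good on every objective
    # and strictly better on at least one.
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    # Keep only the non-dominated designs.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical wrench designs as (mass, -torque) pairs:
designs = [(1.0, -5.0), (1.2, -6.0), (1.1, -4.0), (1.5, -6.0)]
front = pareto_front(designs)
# (1.1, -4.0) is dominated by (1.0, -5.0), and (1.5, -6.0) by (1.2, -6.0),
# so only the other two designs survive on the front.
```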

    Engineers laboriously modify designs in CAD to find these Pareto-optimized designs, using a fair amount of guesswork. Then they use the front’s visual representation as a guideline to find a product that meets a specific performance, considering the various compromises.

    The researchers’ tool, instead, rapidly finds the entire Pareto front and turns it into an interactive map. The input to the model is a product with its design parameters, along with information about how those parameters correspond to specific performance objectives.

    The model first quickly uncovers one design on the Pareto front. Then, it uses some approximation calculations to discover tiny variations in that design. After doing that a few times, it captures all designs on the Pareto front. Those designs are mapped as colored patches on a triangular graph, where each patch represents one Pareto-optimal design, surrounded by its slight variations. Each edge of the graph is labeled with a separate performance objective based on the input data.

    In their paper, the researchers tested their tool on various products, including a wrench, bike frame component, and brake hub, each with three or four design parameters, as well as a standing lamp with 21 design parameters.

    With the lamp, for example, all 21 parameters relate to the thickness of the lamp’s base, height and orientation of its stand, and length and orientation of three elbowed beams attached to the top that hold the light bulbs. The system generated designs and variations corresponding to more than 50 colored patches reflecting a combination of three performance objectives: focal distance, stability, and mass. Placing the cursor on a patch closer to, say, focal distance and stability generates a design with a taller, straighter stand and longer beams oriented for balance. Moving the cursor farther from focal distance and toward mass and stability generates a design with a thicker base and a shorter stand and beams, tilted at different angles.

    Some designs change quite dramatically around the same region of performance tradeoffs and even within the same cluster. This is important from an engineer’s perspective, Schulz says. “You’re finding two designs that, even though they’re very different, they behave in similar ways,” she says. Engineers can use that information “to find designs that are actually better to meet specific use cases.”

    “This work is an important contribution to interactive design of functional real-world objects,” says Takeo Igarashi, a professor of computer science at the University of Tokyo, and an expert in graphic design. Existing computational design tools, Igarashi says, make it difficult for designers to explore design trade-offs. “The tools work as black box and allow no or limited user control,” he says. “This work explicitly addresses this not-yet-tackled important problem. … [It] builds on a solid technical foundation, and the ideas and techniques in this paper will influence the design of design tools in the future.”

    The work was supported by the Defense Advanced Research Projects Agency, the Army Research Office, the Skoltech-MIT Next Generation Program, and the National Science Foundation.

  • The postdoctoral training period is a time when junior researchers learn what it takes to become independent investigators. Pursuing a career in biomedical research can be highly demanding, and young researchers often feel challenged to find time to reflect on various career possibilities, explore options of interest, develop associated professional skills, and still maintain an acceptable work-life balance.

    At MIT, about 1,500 postdocs are appointed to more than 50 departments, and serve as vital members of the Institute’s research workforce. 

    “Expectations for faculty members, particularly in the biomedical sciences, have evolved quite significantly from what they used to be say 10, 20 years ago,” says Sangeeta N. Bhatia, the John J. and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, and the inaugural director of the Koch Institute’s Marble Center for Cancer Nanomedicine. “Professors are not just managing the lab and classroom mediums, but also serving on advisory boards and launching new research and commercial ventures. They also advocate for evidence-based policy and engage with the wider public about the implications of their research.”

    The Marble Center, established through a generous gift from Kathy and Curt Marble ’63, launched the Convergence Scholars Program on National Nanotechnology Day on Oct. 9, 2017, a date that corresponds to the scientific notation — 10⁻⁹ — that designates the nanoscale. The aim of the program is to help postdocs hone the skills they need to succeed within and beyond the academic setting.

    When divergent needs converge

    The brainchild of Tarek R. Fadel, assistant director of the Marble Center, the Convergence Scholars Program is designed to offer career development opportunities that address the needs of the individual trainee and the ever-changing landscape of research. For example, monthly workshops focus on topics such as science communication and management, and leadership of scientific workplaces.

    “Virtually all postdocs contemplate careers in academia. I typically hear trainees ask for advice on how they will recruit new scientists, develop budgets, manage multiple and often overlapping projects, and resolve potential conflicts between collaborators,” Fadel says. “We want our young scientists to think about these issues early in their training and to grow a wide network of mentors on whom they can rely during this transitional phase of their career.”

    Jacob Martin, a Convergence Scholar in the laboratory of Koch Institute Associate Director Darrell Irvine, a professor of biological engineering and of materials science and engineering, recalls his apprehension about funding and other challenges associated with an academic research career.

    “One of the reasons I felt encouraged to apply for the program was that, beyond acknowledging that I should consider ‘alternative’ careers, I didn’t know where to start,” Martin says. “Of course, I knew of some of the options in the biopharmaceutical industry, but I really wanted to put everything back on the table and consider other careers that I might never have realized would be available and enjoyable for me. This idea seemed exciting but also daunting — frankly, overwhelming.”

    Fadel adds that “one of the key aims of the Convergence Scholars Program is to serve as a centralized resource, connecting postdocs with training and opportunities without requiring the time or anxiety of having to figure everything out themselves.”

    The program also offers insight and inroads into careers in industry, health care, the policy arena, or with federal research or regulatory agencies. In order to offer this wide variety of resources for participants, the program partners with organizations around MIT and off campus, including the MIT BE Communications Lab and Harvard Catalyst. The program also engages a network of mentors from the pharmaceutical industry, the government sector, and elsewhere.

    Taking full advantage of the array of opportunities available, recent Convergence Scholar Briana Dunn worked with the education and outreach team at the National Nanotechnology Coordination Office and volunteered doing hands-on nanotechnology experiments with children and families at a national science event. She also explored options in health care and joined the American Medical Writers Association, enrolling in courses to learn more about medical writing and even earning a credential.

    “I was lucky that I had the opportunity to explore my interests in an organized and thoughtful way,” says Dunn, then a member of the laboratory of Angela Belcher, the James Mason Crafts Professor.

    Off to a strong start

    Six postdocs were selected for the inaugural class of Convergence Scholars, one from each of the Marble Center’s member labs.

    In addition to training opportunities, each scholar also receives a stipend to use for professional activities and travel. This year, such activities ranged from volunteering at the U.S. Science and Engineering Festival and Family Science Days held as part of the annual meeting of the American Association for the Advancement of Science (AAAS), to participating in workshops on leadership in bioscience at the Cold Spring Harbor Laboratory, and science policy at AAAS.

    Although the first year of the Convergence Scholars Program has not yet come to a close, participants say in their reviews that the initiative is on the right track. Dunn, for example, has found a job in industry that combines several of her interests.

    “Through CSP, I was able to explore my options more deeply and in a way that really focused on my professional development,” she says.

    Bhatia and Fadel envision that the next cohort — to be announced in October — will also include postdocs from other centers within the Koch Institute for Integrative Cancer Research at MIT, where the Marble Center is housed.

August 14, 2018

  • Three high school girls whose entry in a NASA technology competition was targeted for derailment by a racist and misogynistic online forum were honored by an MIT group recently to celebrate their achievement in engineering a water-filtration system using aerospace technology.

    AeroAfro, an unofficial student group within the Department of Aeronautics and Astronautics, sponsored a campus visit on Aug. 3 for the three girls, in recognition of their placing second in NASA’s Optimus Prime Spinoff Promotion and Research Challenge (OPSPARC). The students toured the campus and presented an overview of their winning entry to an appreciative MIT audience.

    AeroAfro's stated goal is to advance the diversity and inclusion initiative of AeroAstro. Graduate student Stewart Isaacs, who coordinated the visit, said members of the group were “disgusted” by the girls’ experience with online racism.

    “We sought to support their careers in STEM by connecting them with the cutting-edge science and technology resources at MIT,” he said.

    The annual OPSPARC competition challenges students to research NASA technology and develop innovative ways to apply it to everyday situations. OPSPARC 2018 received more than 1,500 entrants from the U.S. and Canada.

    In April, OPSPARC announced that the team of Mikayla Sharrieff, India Skinner, and Bria Snell was one of eight finalists in this year’s contest. The 17-year-old juniors attend Banneker High School in Washington, where they are in the school’s STEM program. Theirs was the only all-black, all-female team to reach the finals.

    Concerned about lead content in their school's drinking fountain water, the three named their project “From H2NO to H2O.” They used simple equipment including glass jars, water contaminated with copper shards, an electric fan, and filtering floss to develop a filtration system that produces water pure and clean enough to drink. Their research was inspired by NASA water purification technology, and conducted at the Inclusive Innovation Incubator (In3), a technology lab near Howard University, where they also volunteer.

    “We were impressed by how their project used aerospace technology to help solve the problem of clean water access,” Isaacs said. “This is a problem facing many majority-black communities in the United States. Their project presentation gave MIT researchers an opportunity to learn how we can follow their lead to support neglected populations with our own work.”

    OPSPARC winners are determined by an assessment of the projects by NASA judges and by public voting promoted via social media, and as soon as the finalists were announced, the voting was underway. But only four days later, users from the anonymous Internet forum 4chan, a site infamous for destructive hoaxes and racist, misogynistic, and homophobic comments, attempted to divert votes from the Banneker team.

    The 4chan users launched a cyberattack against the young women, arguing that the black community was voting for them only because of their race, and recommending computer programs that would hack the NASA system and give the advantage to a team of boys. Reacting to the attack, NASA announced an early termination to public voting.

    “If you want to be successful in this world, you’ll be targeted,” said Snell at the team's presentation. “But it’s OK. You have to persevere.”

    Their determination and belief in their project paid off. In May, NASA’s Goddard Space Flight Center announced that the Banneker team had placed second in the OPSPARC competition.

    The team was also awarded a $4,000 grant by Washington mayor Muriel E. Bowser in recognition of their success. The grant was given to In3 and used by the young women to purchase materials to implement their water purification system in the D.C. area.

    Sharrieff, Skinner, and Snell hope to expand From H2NO to H2O by incorporating a metal filter to extract chlorine from drinking water, as well. Their project has far-reaching potential, from alleviating the water crises in Flint and Baltimore to improving the potability of water in developing countries.

    When they’re not busy in the lab, the three participate on their school’s cheerleading squad, and Snell runs on the Banneker Bulldogs track team. They all plan to attend college in preparation for science-based careers: Sharrieff in biomedical engineering, Skinner in pediatric surgery, and Snell in anesthesiology.

    The students’ visit included a tour of MIT’s Wright Brothers Wind Tunnel, the Media Lab, and other facilities, and introductions to MIT professors and researchers. “What an amazing opportunity,” Sharrieff said of her time on campus. “It opened my eyes to what I want to do in the future regarding graduate school.”

    Skinner echoed her teammate. “It was great to discover what life is like at MIT. It’s truly amazing here.”

  • The latest development in textiles and fibers is a kind of soft hardware that you can wear: cloth that has electronic devices built right into it.

    Researchers at MIT have now embedded high-speed optoelectronic semiconductor devices, including light-emitting diodes (LEDs) and diode photodetectors, within fibers that were then woven at Inman Mills, in South Carolina, into soft, washable fabrics and made into communication systems. This marks the achievement of a long-sought goal of creating “smart” fabrics by incorporating semiconductor devices — the key ingredient of modern electronics — which until now was the missing piece for making fabrics with sophisticated functionality.

    This discovery, the researchers say, could unleash a new “Moore’s Law” for fibers — in other words, a rapid progression in which the capabilities of fibers would grow exponentially over time, just as the capabilities of microchips have grown over decades.

    The findings are described this week in the journal Nature in a paper by former MIT graduate student Michael Rein; his research advisor Yoel Fink, MIT professor of materials science and electrical engineering and CEO of AFFOA (Advanced Functional Fabrics of America); along with a team from MIT, AFFOA, Inman Mills, EPFL in Lausanne, Switzerland, and Lincoln Laboratory.

    A spool of fine, soft fiber made using the new process shows the embedded LEDs turning on and off to demonstrate their functionality. The team has used similar fibers to transmit music to detector fibers, which work even when underwater. (Courtesy of the researchers)

    Optical fibers have been traditionally produced by making a cylindrical object called a “preform,” which is essentially a scaled-up model of the fiber, then heating it. Softened material is then drawn or pulled downward under tension and the resulting fiber is collected on a spool.

    The key breakthrough for producing these new fibers was to add light-emitting semiconductor diodes the size of a grain of sand, and a pair of copper wires a fraction of a hair’s width, to the preform. When heated in a furnace during the fiber-drawing process, the polymer preform partially liquefied, forming a long fiber with the diodes lined up along its center and connected by the copper wires.

    In this case, the solid components were two types of electrical diodes made using standard microchip technology: light-emitting diodes (LEDs) and photosensing diodes. “Both the devices and the wires maintain their dimensions while everything shrinks around them” in the drawing process, Rein says. The resulting fibers were then woven into fabrics, which were laundered 10 times to demonstrate their practicality as possible material for clothing.

    “This approach adds a new insight into the process of making fibers,” says Rein, who was the paper’s lead author and developed the concept that led to the new process. “Instead of drawing the material all together in a liquid state, we mixed in devices in particulate form, together with thin metal wires.”

    One of the advantages of incorporating function into the fiber material itself is that the resulting fiber is inherently waterproof. To demonstrate this, the team placed some of the photodetecting fibers inside a fish tank. A lamp outside the aquarium transmitted music (appropriately, Handel’s “Water Music”) through the water to the fibers in the form of rapid optical signals. The fibers in the tank converted the light pulses — so rapid that the light appears steady to the naked eye — to electrical signals, which were then converted into music. The fibers survived in the water for weeks.

    Though the principle sounds simple, making it work consistently, and making sure that the fibers could be manufactured reliably and in quantity, has been a long and difficult process. Staff at the Advanced Functional Fabrics of America (AFFOA) Institute, led by Jason Cox and Chia-Chun Chung, developed the pathways to increasing yield, throughput, and overall reliability, making these fibers ready for transitioning to industry. At the same time, Marty Ellis from Inman Mills developed techniques for weaving these fibers into fabrics using a conventional industrial manufacturing-scale loom.

    “This paper describes a scalable path for incorporating semiconductor devices into fibers. We are anticipating the emergence of a ‘Moore’s law’ analog in fibers in the years ahead,” Fink says. “It is already allowing us to expand the fundamental capabilities of fabrics to encompass communications, lighting, physiological monitoring, and more. In the years ahead fabrics will deliver value-added services and will no longer just be selected for aesthetics and comfort.”

    He says that the first commercial products incorporating this technology will be reaching the marketplace as early as next year — an extraordinarily short progression from laboratory research to commercialization. Such rapid lab-to-market development was a key part of the reason for creating an academic-industry-government collaborative such as AFFOA in the first place, he says. These initial applications will be specialized products involving communications and safety. “It's going to be the first fabric communication system. We are right now in the process of transitioning the technology to domestic manufacturers and industry at an unprecedented speed and scale,” he says.

    In addition to commercial applications, Fink says the U.S. Department of Defense — one of AFFOA’s major supporters — “is exploring applications of these ideas to our women and men in uniform.”

    Beyond communications, the fibers could potentially have significant applications in the biomedical field, the researchers say. For example, devices using such fibers might be used to make a wristband that could measure pulse or blood oxygen levels, or be woven into a bandage to continuously monitor the healing process.

    The research was supported in part by the MIT Materials Research Science and Engineering Center (MRSEC) through the MRSEC Program of the National Science Foundation, by the U.S. Army Research Laboratory and the U.S. Army Research Office through the Institute for Soldier Nanotechnologies. This work was also supported by the Assistant Secretary of Defense for Research and Engineering.

August 13, 2018

  • A group of high school students, some from as far away as Italy and China, came to MIT’s Edgerton Center this summer to learn more about what it takes to be an engineer — and learned a bit more about themselves as well.

    Now in its 12th year, the Edgerton Center’s Engineering Design Workshop (EDW) brought together 27 students in a month-long creative binge to flesh out their own projects. Some were practical, some were whimsical, but all were challenging and fun.

    The students started out the summer by learning basic electronics, mechanical fabrication, and a bit of 3-D printing. They then broke up into teams and brainstormed their own creations under the guidance of the program’s mentors, many of whom are EDW alumni themselves.

    This year’s final designs, which were showcased in a final presentation for the kids and their parents on Aug. 3, included an automated river water monitoring platform; an improved ship dry dock; an interactive light game; a monowheel unicycle; a bionic exoskeleton; and what can best be described as a cross between a Segway and a Nimbus 2000 broomstick from Harry Potter (but with cup holders).

    Many of the kids seemed shy at first to talk about their projects, but Edgerton Center instructor Chris Mayer gently urged them on.

    “Why don’t you bring that over to the audience, so they could have a closer look?” he said. Invariably, the kids’ close-up demonstrations of their work elicited amazed gasps and nods from the crowd.

    For some like Luo Yan, a senior high school student from the Shanghai Foreign Language School, the workshop was also a hands-on history lesson. He and his team researched the design of Boston’s historic Charlestown Navy Yard and built a scale mock-up showcasing their proposed improvements to its long-defunct dry dock.

    “This has a lot of stories behind it,” he said, pointing to the miniature replicas of the yard’s buildings, which date to 1833. “I just want to see it working again.”

    Thirteen-year-old Mohan Hathi of Cambridge Rindge and Latin School said Team Exoskeleton’s big idea was to build an assistive system that could help with repetitive chores, like lifting heavy objects on an assembly line. But with barely a month to build a working prototype, they ended up having to decide whether to build a bionic hand or arm.

    “We decided, why not do both? And if it happens then it happens,” he said. “But it ended up working, and I’m really happy.”

    Team QUICK (short for aQuatic Underwater Information Collecting Kit) built a submersible sensor platform that could be used for environmental monitoring in the Charles River and other bodies of water. But barely had they presented their final design when the team began considering how it could be improved — better battery life, perhaps, or more robust sensors.

    True to its name, Team LIT (Light Interactive Technology) designed an Arduino-controlled LED wall display, and even came up with a fairy-catching game to go along with it. In the game, a light "fairy" would flit about, and a controller box off to the side allowed players to light up parts of the wall to block its path.

    Meanwhile, teams Monowheel and Broomba showcased their unusual transport designs. The former was a one-wheeled single-track vehicle and the latter a self-balancing witch’s broomstick on wheels. More whimsical than practical, they nevertheless offered an interesting and fun way to get around.

    Though not everyone was able to get their creations off the ground, Edgerton Center instructor Ed Moriarty ’76 said, the experience is invaluable in itself. Moriarty has been with the workshop from the very beginning and has served as both mentor and friend to all its past and present participants.

    “We did not say that you have to succeed in building your project. We said you have to care about your project,” he explained. “We did not set this up as an instructional thing. This is, ‘Hey, what do you want to build?’ ‘Hey, let’s go try it!’”

    “This isn’t about teaching,” he added. “This is about empowering students to get together and do things.”

    That's something that Moriarty takes to heart and has been sharing with high school students — or anyone who happens to drop by the Edgerton Center on a lazy Saturday afternoon — for years now. If you have a big idea, he believes you should always chase it down the rabbit hole, because no matter where you end up, it’ll always be an adventure.

  • Visitors roaming the MIT Stratton Student Center chatted with high school students stationed at various booths, as 3-D printers hummed and a remote-controlled inflatable shark swam above their heads. Down the street at the Johnson Ice Rink, self-driving miniature racecars hurtled down a racetrack while onlookers cheered them on. 

    This was the scene on Sunday, Aug. 5 at the final event of the Beaver Works Summer Institute (BWSI), a four-week summer science, technology, engineering, and math (STEM) program for rising high school seniors. BWSI is an initiative of Beaver Works, a research and education center jointly operated by MIT Lincoln Laboratory and the MIT School of Engineering. BWSI started in 2016 with 46 students. On Sunday, the program concluded its third year with 198 students from 105 schools around the country. 

    “The Beaver Works Summer Institute is a transformational high school program that finds and attracts talented and motivated students throughout the world to science and engineering,” said Professor Sertac Karaman, an academic director of BWSI and an associate professor in MIT’s Department of Aeronautics and Astronautics. “At their core, all our classes offer hands-on, project-based experiences that strengthen the students’ understanding of fundamental concepts in emerging technologies of tomorrow.” 

    This year’s BWSI featured eight courses: Autonomous RACECAR (Rapid Autonomous Complex-Environment Competing Ackermann-steering Robot) Grand Prix; Autonomous Air Vehicle Racing; Autonomous Cognitive Assistant; Medlytics: Data Science for Health and Medicine; Build a Cubesat; Unmanned Air System-Synthetic Aperture Radar (UAS-SAR); Embedded Security and Hardware Hacking; and Hack a 3-D Printer. All courses were supplemented by lectures, and by an online portion designed to teach the fundamentals of each topic, which students took prior to arriving at the program.

    Students from Mexico, Canada, and Nauset Regional High School in Massachusetts also participated in the courses remotely by building the technologies in their own classrooms and listening in on webcasted lectures. At the end of the program, they traveled to the MIT campus to participate in the final event.

    In the spirit of hands-on learning, students made their own 3-D printers and radars, security tested a home door lock system, and designed their own autonomous capabilities for unpiloted aerial vehicles (UAVs) and miniature racecars. Despite the challenging nature of the material, the students caught on quickly.

    “They constantly exceeded my already high expectations. Their ability to immediately engage with what is often graduate-level course material is inspiring,” said Mark Mazumder, a lead instructor of the Autonomous Air Vehicle Racing course and a Lincoln Laboratory staff member.

    Medlytics lead instructor and Lincoln Laboratory staff member Danelle Shah said the participants “amazed me every day.”

    “Not only are these students remarkably bright, but they are ambitious, curious, and passionate about making a difference,” Shah said.

    The students showed off the results of their hard work at the final event after four weeks of learning and building. During the first half of the day, the Autonomous RACECAR teams ran time trials at the Johnson Ice Rink in preparation for the final race later in the afternoon. At Building 31, students in the Autonomous Air Vehicle Racing course raced their drones around an LED track, while nearby, drones from the UAS-SAR course used their radar to image an object obscured by a tarp.

    Students from the remaining courses set up booths at the Stratton Student Center and talked to visitors about their projects, which included a design for a miniature satellite to be launched by NASA, a 3-D printer that could print icing on top of cakes, a cognitive assistant similar to Amazon’s Alexa, and a technique for using machine learning to detect cyberbullying on Twitter.

    The event culminated in a final grand prix-style race in which the autonomous RACECARs competed to finish an intricate racetrack. Various obstacles such as a windmill and bowling pins were placed on the path, which the cars had to navigate around using lidar, cameras, and motion sensors that the students had integrated into the vehicles. The race was followed by an awards ceremony and closing remarks from the BWSI organizers.

    “BWSI 2018 was a huge success, thanks to the passion and dedication of our staff and instructors and the enthusiasm of the students,” said Robert Shin, the director of Beaver Works. “In all eight engineering courses, the students far exceeded our expectations in their achievements. That validated our view that there is no limit to what STEM-focused high school students can achieve under the right circumstances. Our vision is to make this opportunity available to all passionate and motivated high school students everywhere.”

    For many BWSI participants, the program does not end after just four weeks. Nine of this year’s associate instructors were former BWSI students, including Karen Nguyen, who now attends MIT.

    “Before the program, MIT seemed like an unattainable goal. But working with the same autonomous racecars that were used by MIT students and professors allowed me to see that technology could be both advanced and accessible,” Nguyen said. Now on the teaching side of the program, Nguyen is still benefiting from it. “By helping instruct high school students on autonomous robotics, I furthered my own knowledge within the field and also learned how to make certain topics in STEM more approachable to a wider range of people,” she said.

    The organizers hope to expand the program in the future by helping schools develop their own local STEM programs based on the BWSI curricula. They are also brainstorming possible new course topics for next year, such as autonomous marine vehicles, disaster relief technologies, and an assistive technology hackathon. Although the third year of BWSI is now over, the organizers hope the program will have a lasting impact on the students.

    “At the end of the program, you just look at the things you accomplished and it utterly changes what you think high schoolers, given the right tools and guidance, are capable of,” said William Shi, a student who participated in the Embedded Security and Hardware Hacking course. “You feel empowered to go out and try your hand at even more ambitious things, to see how far you truly can go.”

August 12, 2018

  • Kristala Jones Prather is speaking in a packed MIT lecture hall. Many of her students wear reading glasses, some have a little less hair than they used to, and most of them are well dressed and groomed. But all of these engineers, biologists, chemists, microbiologists, and biochemists take furious notes in thick course binders and lean forward to study the equations she jots on the chalkboard.

    As Prather delves into Fermentation Technology, a short program offered by MIT Professional Education, she engages and challenges her students. “Do we have a few biochemists? Does this model remind you of anything?” she asks. “It may have been a dark time, but think back to your undergraduate biochemistry class,” she jokes before diving back into her lecture, one of 16 lectures the students will absorb. The course is the oldest in the MIT Professional Education catalog.

    Since 1962, this intensive program has attracted industry professionals to campus for five days that promise a review of the fundamentals in the application of biological and engineering principles to problems involving microbial, mammalian, and biochemical systems.

    Fermentation Technology gathers a diverse array of professionals to glean the latest insights on terrain they navigate every day at work. It is an opportunity for them to gain knowledge of what might be coming next in biological and biochemical technology, with an emphasis on applying biological systems to industrial practice. Prather, the Arthur D. Little Professor of Chemical Engineering at MIT, oversees the course with Daniel I. C. Wang, an Institute Professor in the Department of Chemical Engineering.

    In addition to Prather and Wang, Fermentation Technology features a mix of guest lecturers that include other MIT faculty and industry professionals, such as Neal Connors from Phoenix BioConsulting in New Jersey, Kara Calhoun from the California biotech company Genentech, and Morris Z. Rosenberg, a biotech consultant in Washington.

    As she wraps up her first of two 90-minute lectures of the day, Prather deadpans: “Marinate on that over the break. I’m happy to answer questions when we come back if it’s still not making sense to you.”

    As the room empties for lunch, several of the visiting professionals make quick calls into the office or to check on family back home. Bill Morrison, a facilities engineer at BioMarin Pharmaceuticals in San Rafael, California, explains why he’s flown into Boston for hours of difficult lectures. He is moving into a process engineering role at his company and the course material is helpful for the most part. “I’m weak on the theory, but the other part about the mechanism of production is more up my alley,” he says.

    Katherine Wyndham from Novavax Inc., a clinical-stage vaccine company headquartered in Gaithersburg, Maryland, says she is a member of the manufacturing, science, and technology group at her company. “This course is really giving me a technical base for what I do,” she says. “I’d say 50 percent is directly applicable to stuff I use every day, and the other 50 percent provides me with new insight into what the process development group does.”

    Making additional notes at her lecture seat, Soniya Parulekar of Merck and Company, a global pharmaceutical company, has arrived from Philadelphia for the program. She works in fermentation research and development. “A lot of the things I’m seeing discussed in this course are giving me a better sense of what I’m working on — a deeper knowledge,” she says.

    Soon enough Prather is back from lunch. She begins to animatedly discuss modeling and bioprocess monitoring as industry professionals from across the country settle into their chairs to absorb as much information as they can.

    There are two and a half days left of the course. Or, to be exact, seven more lectures: perfusion reactors, medium design and high cell-density cultivation, power requirement in bioreactors, oxygen transfer and shear in bioreactors, design of experiments, analytics in biomanufacturing, and bioprocess simulation and economics. Attention in the room is still running high.

    For Prather, a room full of professionals offers interesting teaching opportunities. “I teach the same material in my biochemical engineering class for undergraduates,” she says. “The short-course students bring a much richer perspective based on their own professional experiences. Sometimes,” she adds, “they teach me things that I can then offer to our own students.”

August 10, 2018

  • The rise of 5G, or fifth generation, mobile technologies is refashioning the wireless communications and networking industry. The School of Engineering recently asked Muriel Médard, the Cecil H. Green Professor in the Electrical Engineering and Computer Science Department at MIT, to explain what that means and why it matters.

    Médard, the co-founder of three companies to commercialize network coding — CodeOn, Steinwurf and Chocolate Cloud — is considered a global technology leader. Her work in network coding, hardware implementation, and her original algorithms have received widespread recognition and awards. At MIT, Médard leads the Network Coding and Reliable Communications Group at the Research Laboratory of Electronics.

    Q. People are hearing that 5G will transform industries across the world and bring advances in smart transportation, health care, wearables, augmented reality, and the internet of things. The media report that strategic players in the U.S. and internationally are developing these technologies for market by 2020 or earlier. What sets this generation apart from its predecessors?

    A. The reason 5G is so different is that what exactly it will look like is still up in the air. Everyone agrees the phrase is a bit of a catch-all. I’ll give you some big brush strokes on 5G and what people are looking at actively in the area.

    In second, third, and fourth generations, people got a phone service that by 4G really became a system of phone plus data. It was all fairly traditional. For instance, people are used to switching manually from their cellular provider to available Wi-Fi at their local coffee shop or wherever.

    One of the main ideas behind 5G is that you’ll have a single network that allows a blended offering. People are looking at using a multi-path approach, which means drawing on Wi-Fi and non-Wi-Fi 5G (or sometimes 4G) seamlessly. This poses some difficult coordination problems. It requires network coding, by using algebraic combinations, across different paths to create a single, smooth experience.
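
    The coding schemes Médard describes are far more sophisticated, but the core idea of coding across paths can be seen in a toy sketch: send two packets over two different paths, plus one algebraic combination (here a simple XOR), so that losing any single transmission still lets the receiver recover both packets. The packet contents and the XOR choice are illustrative assumptions; practical systems combine packets with random linear codes over larger finite fields.

```python
def xor(a: bytes, b: bytes) -> bytes:
    # Bytewise XOR of two equal-length packets.
    return bytes(x ^ y for x, y in zip(a, b))

# Two packets sent over two paths (say, Wi-Fi and cellular),
# plus one coded combination sent over either path.
pkt_a = b"hello via wifi "
pkt_b = b"hello via 5G!! "
coded = xor(pkt_a, pkt_b)

# Any two of the three transmissions suffice. Suppose the path
# carrying pkt_a drops it; the receiver still recovers it:
recovered_a = xor(coded, pkt_b)
print(recovered_a == pkt_a)  # True
```

    The design point is that the receiver never needs to coordinate which path delivered what: any two of the three transmissions carry enough information.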

    Another important part of 5G is that people are looking at using millimeter waves, whose high frequencies make it possible to avoid interference among multiple senders transmitting simultaneously in much closer proximity than is possible now. These high frequencies, with wide-open spectrum regions, may be well suited to very large amounts of data that need to be transmitted over fairly short distances.

    There is also what people call “the fog,” which is something more than just how people feel in the morning before coffee. Fog computing, in effect, involves extending cloud capabilities, such as compute, storage and networking services, through various nodes and IoT gateways. It involves being able to draw on the presence of different users nearby in order to establish small, lightweight, rapidly set-up, rapidly torn-down, peer-to-peer type networks. Again, the right coding is extremely important so that we don't have difficult problems of coordination. You must be able to code across the different users and the different portions of the network.

    Q. You’ve described 5G as actively looking at incorporating services and modes of communications that have not been part of traditional offerings. What else sets it apart?

    A. Let’s talk about global reach. With 5G, people are looking at incorporating features, such as satellite service, that are seamlessly integrated with terrestrial service. For this, we also really need reliance on coding. You can imagine how there is no way you can rely on traditional coordination and scheduling across satellites and nodes on the ground on a large scale.

    Another thing that makes 5G so different from other evolutions is the sheer volume of players. If you were talking about 3G or 4G, it was pretty straightforward. Your key players were doing equipment provisioning to service providers.

    Now it’s a very busy and more varied set of players. The different aspects that I’ve talked about are often not all considered by the same player. Some people are looking at worldwide coverage via satellite networking. Other people are looking at blending new channels, such as the millimeter-wave ones I referred to earlier, with Wi-Fi, which basically requires marrying existing infrastructure with new.

    I think finding a coherent and central source of information is a big challenge. You have the organization that governs cellular standards, 3GPP, but the whole industry is transforming as we watch in the area of 5G. It’s not clear whether it’s going to be 3GPP still calling the shots. You have so many new entrants that are not necessarily part of the old guard.

    Q. What do you believe people will notice on a daily level with the rise of 5G?

    A. I’ll give you my vision for the future of 5G, with the caveat that we’re now moving into an area that is more a matter of opinion. I see heterogeneity as part of the design. You're going to have a network that is talking to a large and disparate set of nodes with very different purposes for very different applications. You’re going to see a view that emphasizes integration of existing and new resources over just the deployment of new resources.

    And I think the people who are going to win in 5G may not be the same players as before. It will be the company that figures out how to provide people with a seamless experience using the different substrates in a way that is highly opportunistic. It has to be a system that integrates everything naturally because you cannot preplan the satellite beam you're going to be in, the fog network you're going to be in, and the IoT devices that are going to be around you. There is no way even to maintain or manage so much information. Everything is becoming too complex and, in effect, organic. And my view on how to do that? Network coding. That’s an opinion but it’s a strongly held one.

August 8, 2018

    When the FBI filed a court order in 2016 commanding Apple to unlock the iPhone of one of the shooters in a terrorist attack in San Bernardino, California, the news made headlines across the globe. Yet every day there are tens of thousands of court orders asking tech companies to turn over Americans’ private data. Many of these orders never see the light of day, leaving a whole privacy-sensitive aspect of government power immune to judicial oversight and lacking in public accountability.

    To protect the integrity of ongoing investigations, these requests require some secrecy: Companies usually aren’t allowed to inform individual users that they’re being investigated, and the court orders themselves are also temporarily hidden from the public.

    In many cases, though, charges never actually materialize, and the sealed orders usually end up forgotten by the courts that issue them, resulting in a severe accountability deficit.

    To address this issue, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Internet Policy Research Initiative (IPRI) have proposed a new cryptographic system to improve the accountability of government surveillance while still maintaining enough confidentiality for the police to do their jobs.

    “While certain information may need to stay secret for an investigation to be done properly, some details have to be revealed for accountability to even be possible,” says CSAIL graduate student Jonathan Frankle, one of the lead authors of a new paper about the system, which they’ve dubbed “AUDIT” ("Accountability of Unreleased Data for Improved Transparency"). “This work is about using modern cryptography to develop creative ways to balance these conflicting issues.”

    Many of AUDIT’s technical methods were developed by one of its co-authors, MIT Professor Shafi Goldwasser. AUDIT is designed around a public ledger on which government officials share information about data requests. When a judge issues a secret court order or a law enforcement agency secretly requests data from a company, they have to make an iron-clad promise to make the data request public later in the form of what’s known as a “cryptographic commitment.” If the courts ultimately decide to release the data, the public can rest assured that the correct documents were released in full. If the courts decide not to, then that refusal itself will be made known.
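
    The article doesn't detail AUDIT's commitment construction; as a rough sketch of what a cryptographic commitment does, a simple hash-based commitment lets an official post a binding fingerprint of a sealed order now and prove later that a released document is exactly the one committed to. The function names and the SHA-256-with-nonce design below are illustrative assumptions, not AUDIT's actual scheme.

```python
import hashlib
import secrets

def commit(document: bytes):
    """Commit to a document without revealing it; return (commitment, opening)."""
    nonce = secrets.token_bytes(32)  # randomness keeps the commitment hiding
    digest = hashlib.sha256(nonce + document).hexdigest()
    return digest, nonce

def verify_opening(commitment: str, nonce: bytes, document: bytes) -> bool:
    """Check that a later-revealed document matches the earlier commitment."""
    return hashlib.sha256(nonce + document).hexdigest() == commitment

order = b"Sealed order: produce records for account X"
c, nonce = commit(order)       # the digest c is posted to the public ledger
# ... later, if the court unseals the order and reveals (nonce, order) ...
print(verify_opening(c, nonce, order))        # True: the full document matches
print(verify_opening(c, nonce, b"tampered"))  # False: substitution is detected
```

    Because the hash is binding, officials cannot later swap in a different document; because only the digest is published, nothing about the sealed order leaks in the meantime.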

    AUDIT can also be used to demonstrate that actions by law-enforcement agencies are consistent with what a court order actually allows. For example, if a court order leads to the FBI going to Amazon to get records about a specific customer, AUDIT can prove that the FBI’s request is above board using a cryptographic method called “zero-knowledge proofs.” First developed in the 1980s by Goldwasser and other researchers, these proofs counterintuitively make it possible to prove that surveillance is being conducted properly without revealing any specific information about the surveillance.
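
    AUDIT's actual proofs concern the contents of court orders, but the counterintuitive flavor of zero-knowledge can be illustrated with the classic Schnorr identification protocol, in which a prover demonstrates knowledge of a secret exponent without revealing it (toy parameters chosen for illustration only):

```python
import secrets

# Toy group parameters (illustration only; real systems use ~256-bit groups).
p = 2039   # prime, p = 2q + 1
q = 1019   # prime order of the subgroup of squares modulo p
g = 4      # generator of that subgroup

# Prover's secret x and the matching public value y = g^x mod p.
x = secrets.randbelow(q)
y = pow(g, x, p)

# One round of the Schnorr identification protocol:
r = secrets.randbelow(q)   # prover picks a random nonce
t = pow(g, r, p)           # 1. prover sends a commitment
c = secrets.randbelow(q)   # 2. verifier sends a random challenge
s = (r + c * x) % q        # 3. prover responds; s alone leaks nothing about x

# The verifier checks g^s == t * y^c (mod p) without ever learning x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

    Because the response is blinded by the random nonce, the transcript could have been simulated by anyone; repeating the round drives a cheating prover's success probability toward zero while still revealing nothing about the secret.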

    The team's approach builds on privacy research in accountable systems led by co-author Daniel J. Weitzner, a principal research scientist at CSAIL and director of IPRI.

    “As the volume of personal information expands, better accountability for how that information is used is essential for maintaining public trust,” says Weitzner. “We know that the public is worried about losing control over their personal data, so building technology that can improve actual accountability will help increase trust in the internet environment overall.”

    Another element of AUDIT is that statistical information can be aggregated so that the extent of surveillance can be studied at a larger scale. This enables the public to ask all sorts of tough questions about how their data are being shared. What kinds of cases are most likely to prompt court orders? How many judges issued more than 100 orders in the past year, or more than 10 requests to Facebook this month? Frankle says the team’s goal is to establish a set of reliable, court-issued transparency reports to supplement the voluntary reports that companies put out.

    “We know that the legal system struggles to keep up with the complexity of increasingly sophisticated users of personal data,” says Weitzner. “Systems like AUDIT can help courts keep track of how the police conduct surveillance and assure that they are acting within the scope of the law, without impeding legitimate investigative activity.”

    Importantly, the team developed its aggregation system using an approach called multi-party computation (MPC), which allows courts to disclose relevant information without actually revealing their internal workings or data to one another. The current state-of-the-art MPC would normally be too slow to run on the data of hundreds of federal judges across the entire court system, so the team took advantage of the court system’s natural hierarchy of lower and higher courts to design a particular variant of MPC that would scale efficiently for the federal judiciary.
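
    The team's hierarchical MPC variant is more elaborate, but the basic primitive, computing a public total from private per-court counts, can be sketched with additive secret sharing (a simplified illustration, not the paper's actual protocol):

```python
import secrets

M = 2**61 - 1  # public modulus, comfortably larger than any realistic total

def share(value: int, n: int) -> list[int]:
    """Split a private count into n random-looking additive shares mod M."""
    shares = [secrets.randbelow(M) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % M)  # shares sum to value mod M
    return shares

# Three courts each hold a private count of orders issued.
counts = [12, 47, 3]
n = len(counts)

# Court i sends its j-th share to aggregator j; no single share reveals anything.
all_shares = [share(c, n) for c in counts]

# Each aggregator sums the shares it received; combining the partial sums
# yields the public total without any court disclosing its own count.
partials = [sum(all_shares[i][j] for i in range(n)) % M for j in range(n)]
total = sum(partials) % M

assert total == sum(counts)
```

    Each individual share is uniformly random, so nothing short of the full combination reveals a court's count; arranging who shares with whom along the judiciary's existing hierarchy is what lets the real system scale.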

    According to Frankle, AUDIT could be applied to any process in which data must be both kept secret but also subject to public scrutiny. For example, clinical trials of new drugs often involve private information, but also require enough transparency to assure regulators and the public that proper testing protocols are being observed.

    “It’s completely reasonable for government officials to want some level of secrecy, so that they can perform their duties without fear of interference from those who are under investigation,” Frankle says. “But that secrecy can’t be permanent. People have a right to know if their personal data has been accessed, and at a higher level, we as a public have the right to know how much surveillance is going on.”

    Next, the team plans to explore how AUDIT’s design could be adapted, through further software engineering, to handle even more complex data requests. The researchers are also exploring the possibility of partnering with specific federal judges to develop a prototype for real-world use.

    “My hope is that, once this proof of concept becomes reality, court administrators will embrace the possibility of enhancing public oversight while preserving necessary secrecy,” says Stephen William Smith, a federal magistrate judge who has written extensively about government accountability. “Lessons learned here will undoubtedly smooth the way towards greater accountability for a broader class of secret information processes, which are a hallmark of our digital age.”

    Frankle co-wrote the paper with Goldwasser, Weitzner, CSAIL PhD graduate Sunoo Park and undergraduate Daniel Shaar. The paper will be presented at this week’s USENIX Security conference in Baltimore. IPRI team members will also discuss related surveillance issues in more detail at upcoming workshops for both USENIX and this week’s International Cryptology Conference (Crypto 2018) in Santa Barbara.

    The research was supported by IPRI, the National Science Foundation, the Defense Advanced Research Projects Agency, and the Simons Foundation.

August 7, 2018

  • MIT chemical engineers have developed a new sensor that lets them see inside cancer cells and determine whether the cells are responding to a particular type of chemotherapy drug.

    The sensors, which detect hydrogen peroxide inside human cells, could help researchers identify new cancer drugs that boost levels of hydrogen peroxide, which induces programmed cell death. The sensors could also be adapted to screen individual patients’ tumors to predict whether such drugs would be effective against them.

    “The same therapy isn’t going to work against all tumors,” says Hadley Sikes, an associate professor of chemical engineering at MIT. “Currently there’s a real dearth of quantitative, chemically specific tools to be able to measure the changes that occur in tumor cells versus normal cells in response to drug treatment.”

    Sikes is the senior author of the study, which appears in the Aug. 7 issue of Nature Communications. The paper’s first author is graduate student Troy Langford; other authors are former graduate students Beijing Huang and Joseph Lim and graduate student Sun Jin Moon.

    Tracking hydrogen peroxide

    Cancer cells often have mutations that cause their metabolism to go awry and produce abnormally high fluxes of hydrogen peroxide. When too much of the molecule is produced, it can damage cells, so cancer cells become highly dependent on antioxidant systems that remove hydrogen peroxide from cells.

    Drugs that target this vulnerability, which are known as “redox drugs,” can work by either disabling the antioxidant systems or further boosting production of hydrogen peroxide. Many such drugs have entered clinical trials, with mixed results.

    “One of the problems is that the clinical trials usually find that they work for some patients and they don’t work for other patients,” Sikes says. “We really need tools to be able to do more well-designed trials where we figure out which patients are going to respond to this approach and which aren’t, so more of these drugs can be approved.”

    To help move toward that goal, Sikes set out to design a sensor that could sensitively detect hydrogen peroxide inside human cells, allowing scientists to measure a cell’s response to such drugs.

    Existing hydrogen peroxide sensors are based on proteins called transcription factors, taken from microbes and engineered to fluoresce when they react with hydrogen peroxide. Sikes and her colleagues tried to use these in human cells but found that they were not sensitive in the range of hydrogen peroxide they were trying to detect, which led them to seek human proteins that could perform the task.

    Through studies of the network of human proteins that become oxidized with increasing hydrogen peroxide, the researchers identified an enzyme called peroxiredoxin that dominates most human cells’ reactions with the molecule. One of this enzyme’s many functions is sensing changes in hydrogen peroxide levels.

    Langford then modified the protein by adding two fluorescent molecules to it — a green fluorescent protein at one end and a red fluorescent protein at the other end. When the sensor reacts with hydrogen peroxide, its shape changes, bringing the two fluorescent proteins closer together. The researchers can detect whether this shift has occurred by shining green light onto the cells: If no hydrogen peroxide has been detected, the glow remains green; if hydrogen peroxide is present, the sensor glows red instead.

    Predicting success

    The researchers tested their new sensor in two types of human cancer cells: one set that they knew was susceptible to a redox drug called piperlongumine, and another that they knew was not susceptible. The sensor revealed that hydrogen peroxide levels were unchanged in the resistant cells but went up in the susceptible cells, as the researchers expected.

    Sikes envisions two major uses for this sensor. One is to screen libraries of existing drugs, or compounds that could potentially be used as drugs, to determine if they have the desired effect of increasing hydrogen peroxide concentration in cancer cells. Another potential use is to screen patients before they receive such drugs, to see if the drugs will be successful against each patient’s tumor. Sikes is now pursuing both of these approaches.

    “You have to know which cancer drugs work in this way, and then which tumors are going to respond,” she says. “Those are two separate but related problems that both need to be solved for this approach to have practical impact in the clinic.”

    The research was funded by the Haas Family Fellowship in Chemical Engineering, the National Science Foundation, a Samsung Fellowship, and a Burroughs Wellcome Fund Career Award at the Scientific Interface.

August 6, 2018

  • Glioma, a type of brain cancer, is normally treated by removing as much of the tumor as possible, followed by radiation or chemotherapy. With this treatment, patients survive an average of about 10 years, but the tumors inevitably grow back.

    A team of researchers from MIT, Brigham and Women’s Hospital, and Massachusetts General Hospital hopes to extend patients’ lifespan by delivering directly to the brain a drug that targets a mutation found in 20 to 25 percent of all gliomas. (This mutation is usually seen in gliomas that strike adults under the age of 45.) The researchers have devised a way to rapidly check for the mutation during brain surgery, and if the mutation is present, they can implant microparticles that gradually release the drug over several days or weeks.

    “To provide really effective therapy, we need to diagnose very quickly, and ideally have a mutation diagnosis that can help guide genotype-specific treatment,” says Giovanni Traverso, an assistant professor at Brigham and Women’s Hospital, Harvard Medical School, a research affiliate at MIT’s Koch Institute for Integrative Cancer Research, and one of the senior authors of the paper.

    The researchers are also working on ways to identify and target other mutations found in gliomas and other types of brain tumors.

    “This paradigm allows us to modify our current intraoperative resection strategy by applying molecular therapeutics that target residual tumor cells based on their specific vulnerabilities,” says Ganesh Shankar, who is currently completing a spine surgery fellowship at Cleveland Clinic prior to returning as a neurosurgeon at Massachusetts General Hospital, where he performed this study.

    Shankar and Koch Institute postdoc Ameya Kirtane are the lead authors of the paper, which appears in the Proceedings of the National Academy of Sciences the week of Aug. 6. Daniel Cahill, a neurosurgeon at MGH and associate professor at Harvard Medical School, is a senior author of the paper, and Robert Langer, the David H. Koch Institute Professor at MIT, is also an author.

    Targeting tumors

    The tumors that the researchers targeted in this study, historically known as low-grade gliomas, usually occur in patients between the ages of 20 and 40. During surgery, doctors try to remove as much of the tumor as possible, but they can’t be too aggressive if tumors invade the areas of the brain responsible for key functions such as speech or movement. The research team wanted to find a way to locally treat those cancer cells with a targeted drug that could delay tumor regrowth.

    To achieve that, the researchers decided to target mutations in the genes IDH1 and IDH2. Cancer cells with these mutations shut off a metabolic pathway that cells normally use to create a molecule called NAD, making them highly dependent on an alternative pathway that requires an enzyme called NAMPT. Researchers have been working to develop NAMPT inhibitors to treat cancer.

    So far, these drugs have not been used for glioma, in part because of the difficulty in getting them across the blood-brain barrier, which separates the brain from circulating blood and prevents large molecules from entering the brain. NAMPT inhibitors can also produce serious side effects in the retina, bone marrow, liver, and blood platelets when they are given orally or intravenously.

    To deliver the drugs locally, the researchers developed microparticles in which the NAMPT inhibitor is embedded in PLGA, a polymer that has been shown to be safe for use in humans. Another desirable feature of PLGA is that the rate at which the drug is released can be controlled by altering the ratio of the two polymers that make up PLGA — lactic acid and glycolic acid.

    To determine which patients would benefit from treatment with the NAMPT inhibitor, the researchers devised a genetic test that can reveal the presence of the IDH mutation in approximately 30 minutes. This allows the procedure to be done on biopsied tissue during the surgery, which takes about four hours. If the test is positive, the microparticles can be placed in the brain, where they gradually release the drug, killing cells left behind during the surgery.

    In tests in mice, the researchers found that treatment with the drug-carrying particles extended the survival of mice with IDH mutant-positive gliomas. As they expected, the treatment did not work against tumors without the IDH mutation. In mice treated with the particles, the team also found none of the harmful side effects seen when NAMPT inhibitors are given throughout the body.

    “When you dose these drugs locally, none of those side effects are seen,” Traverso says. “So not only can you have a positive impact on the tumor, but you can also address the side effects which sometimes limit the use of a drug that is otherwise effective against tumors.”

    The new approach builds on similar work from Langer’s lab that led to the first FDA-approved controlled drug-release system for brain cancer — a tiny wafer that can be implanted in the brain following surgery.

    “I am very excited about this new paper, which complements very nicely the earlier work we did with Henry Brem of Johns Hopkins that led to Gliadel, which has now been approved in over 30 countries and has been used clinically for the past 22 years,” Langer says.

    An array of options

    The researchers are now developing tests for other common mutations found in brain tumors, with the goal of devising an array of potential treatments for surgeons to choose from based on the test results. This approach could also be used for tumors in other parts of the body, the researchers say.

    “There’s no reason this has to be restricted to just gliomas,” Shankar says. “It should be able to be used anywhere where there’s a well-defined hotspot mutation.”

    They also plan to do some tests of the IDH-targeted treatment in larger animals, to help determine the right dosages, before planning for clinical trials in patients.

    “We feel its best use would be in the early stages, to improve local control and prevent regrowth at the site,” Cahill says. “Ideally it would be integrated early in the standard-of-care treatment for patients, and we would try to put off the recurrence of the disease for many years or decades. That’s what we’re hoping.”

    The research was funded by the American Brain Tumor Association, a SPORE grant from the National Cancer Institute, the Burroughs Wellcome Career Award in the Medical Sciences, the National Institutes of Health, and the Division of Gastroenterology at Brigham and Women’s Hospital.

  • In memory of MIT alumnus Samuel Ing '53, MS '54, ScD '59, his family has established a memorial fund to support graduate students at MIT’s Plasma Science and Fusion Center (PSFC) who are taking part in the center’s push to create a smaller, faster, and less expensive path to fusion energy.

    Samuel Ing was born in Shanghai, China, in 1932. Mentored by Professor Thomas Sherwood at MIT, he received BS, MS, and ScD degrees in chemical engineering in 1953, 1954, and 1959 respectively. Joining the Xerox Corporation after graduation, he rose from senior scientist, to principal scientist, to senior vice president of the Xerographic Technology Laboratory at the Webster Research Center in Webster, New York. He spent most of his career in western New York State with his wife Mabel, whom he met at an MIT dance. They raised four daughters: Julie, Bonnie, Mimi, and Polly.

    An innovator and advocate for new technologies, including desktop publishing, Samuel Ing became intrigued with MIT’s approach to creating fusion energy after attending a talk by PSFC Director Dennis Whyte at the MIT Club in Palo Alto in early 2016. His daughter Emilie “Mimi” Slaughter ’87, SM ’88, who majored in electrical engineering, later expressed her own enthusiasm to her father when, as a member of the School of Engineering Dean’s Advisory Council, she heard Whyte speak in the fall of 2017.

    In pursuit of a clean and virtually endless source of energy to fulfill the growing demands around the world, MIT has championed fusion research since the 1970s, designing compact tokamaks that use high magnetic fields to heat and contain the plasma fuel in a donut-shaped vacuum chamber. The PSFC is now working on SPARC, a new high-field, net fusion energy experiment. Researchers are using a thin superconducting tape to create compact electromagnets with fields significantly higher than those available to any other current fusion experiment. These magnets would make it possible to build a smaller, high-field tokamak at less cost, while speeding the quest for fusion energy.

    Mimi Slaughter remembers her father’s passion for innovation and entrepreneurship.

    “It’s the MIT culture,” she says. “I see that in the fusion lab — the idea of just doing it; figuring out a way to try to make it happen, not necessarily through the traditional channels. I know my Dad agrees. He did that at Xerox. He had his own lab, creating his own desktop copiers. That grew out of what he experienced at MIT.”

    The Ing family is celebrating that creative spirit with the Samuel W. Ing Memorial Fund for MIT graduate students who will be driving the research and discovery forward on SPARC. It was a class of PSFC graduate students that proposed the original concept for this experiment, and it will be the young minds with new ideas that, with the support of the fund, will advance fusion research at MIT.

    Or as Sam Ing once said: “Very interesting technology. It has a tremendous future, and if anyone can do it, it’s MIT.”

  • Since 2012, a handful of Saudi Arabia’s top scientists and engineers have arrived on MIT’s campus every year for a once-in-a-lifetime experience. Through the Ibn Khaldun Fellowship for Saudi Arabian Women, these Saudi female scientists and engineers with PhDs are invited to spend one year conducting research at MIT. Each fellow is paired with an MIT faculty member for a research project and exposed to a number of professional development opportunities.

    While the program was launched in MIT’s Department of Mechanical Engineering, the 27 fellows have been placed in 14 different departments, labs, and centers across the Institute, including the Computer Science and Artificial Intelligence Laboratory, the MIT Sloan School of Management, and the MIT Media Lab. Recently, 24 of these fellows convened on campus for the program’s first-ever reunion.

    “For me, this week has felt like a family reunion,” remarks Kate Anderson, former program manager for the fellowship. Anderson, along with program director and professor of mechanical engineering Kamal Youcef-Toumi and recently appointed program manager Theresa Werth, has worked with each fellow on identifying the appropriate research project and helping them navigate their new surroundings at MIT.

    At the reunion, the five current fellows who are wrapping up their assignments at MIT provided an overview of their research projects. Topics ranged from carbon nanomaterials to solar cells and personalized drug screening. Alumni of the program then provided updates both on their research projects and how their careers have progressed since returning to Saudi Arabia.

    “It was very inspiring to hear about how so many fellows have built successful careers by taking bold steps that required a lot of courage,” recalls Areej Al-Wabil, 2015 fellow and current principal investigator at the Center for Complex Engineering at King Abdulaziz City for Science and Technology (KACST) and MIT.  

    In addition to serving as leaders in their respective research fields, many former fellows, such as Al-Wabil, have gone on to serve as deans, vice deans, directors of research centers, and principal investigators.

    “You are the change and you are the leaders,” said Youcef-Toumi in his remarks at the reunion. “Each one of you — your contributions are very significant. You are influencing people and inspiring them to become motivated just by interacting with you.”

    Finding a sense of community inside and outside of the lab

    Since the program launched, the 27 fellows have authored 34 journal publications and 27 conference papers, and submitted four patents, all directly related to the research they conducted at MIT.

    For current fellow Thamraa Al-Shahrani, her project afforded her the opportunity to research a topic that could have a major impact on her home country. She worked with Tonio Buonassisi, associate professor of mechanical engineering and director of the MIT Photovoltaics Research Laboratory, on developing solar cells that can function in hot or arid climates — like that of Saudi Arabia.


    “Working on something that has direct applications to my country has been amazing,” says Al-Shahrani. She and her team exposed solar cells to varying temperatures and studied their behavior in the hopes of determining how to improve temperature stability. Al-Shahrani served as first author on the resulting research paper, which was presented at the Materials Research Society meeting in April.

    Helping lead a team of scientists from MIT and Saudi Aramco did more than give Al-Shahrani critical research experience — she learned about the teamwork necessary for international research collaborations. “Tonio and his team were so supportive and encouraging,” she says. “If there was a problem with the study, we would all discuss it together to come up with a solution. The experience really gave me a sense of what it’s like to work on a team.” 

    This sense of community extended beyond the lab to her fellow Ibn Khaldun classmates. “The program connects fellows to each other — we got to celebrate Eid al-Fitr and Eid al-Adha together along with Kate and Theresa,” she recalls.

    The network the fellows create continues after the experience at MIT concludes. “An important part of the fellowship for me was building a network of professionals in a similar phase of our careers,” adds Al-Wabil. This network has resulted in career opportunities for the fellows back in Saudi Arabia.

    Bringing hacker culture to Saudi Arabia

    For Al-Wabil, her work researching the design planning and ideation process in engineering projects, with Maria Yang, associate professor of mechanical engineering and director of MIT’s Ideation Laboratory, was just the start. “The experience was so rich,” says Al-Wabil. “I learned on so many fronts about scientific research, educational best practices, and professional development skills.”

    Al-Wabil served as a mentor for class 2.00b (Toy Design). In lab sessions, she and her fellow mentors brought their discipline-specific expertise to the design process to help guide students from the ideation phase to a working prototype. “In teaching we rarely touch on all phases, so that was an immersive learning experience for me as a teaching faculty,” she adds.

    While at MIT, Al-Wabil also immersed herself in hacker culture. After participating in the annual Assistive Technologies Hackathon, she was inspired to bring the hackathon platform back to her institution in Saudi Arabia. “I was able to take that experience and introduce the same concept in Saudi at a smaller scale,” says Al-Wabil. In October, her institution will host a “Hacking Medicine” hackathon using the MIT hacking model.

    Empowering a new generation of Saudi women

    As Saudi Arabia works toward Saudi Vision 2030, which hopes to increase women’s participation in the workforce from 22 percent to 30 percent by the year 2030, programs like the Ibn Khaldun Fellowship take on a greater purpose.

    “These women are not only contributing to the science and technology development of the Kingdom, they are changing the country as we speak,” says Anderson.

    Al-Shahrani sees the value of the program at this particular time in Saudi Arabia’s history. “The fellowship has helped a new generation of Saudi women build their leadership skills,” she notes. “This is especially important now as Saudi Arabia is making big changes and supporting women in the workforce.”

    In March, the program announced an agreement with new sponsor KACST to extend the fellowship for the next decade. In January 2019, five new fellows will join a community of women who are shaping the future of science and engineering in their country. 

  • For the past three years, the Department of Defense’s Naval Air Systems Command (NAVAIR) organization has committed to a different kind of mission than any it has pursued before: to transform its engineering acquisition capabilities to a model-based design. The goal is to shorten the timeline from beginning to delivery without sacrificing quality or precision.

    Since early 2017, an essential part of implementing that transformation has been NAVAIR’s participation in the MIT program “Architecture and Systems Engineering: Models and Methods to Manage Complex Systems,” a four-course online program on model-based systems engineering.

    “It is taking way too long to develop and deliver the next generation of war fighting capability to our war fighters,” says David Cohen, director of the Air Platform Systems Engineering Department at NAVAIR, referring to the current design and development processes based on systems engineering practices and processes from the 1970s. “We need to shorten that timeline dramatically. We have a national security imperative to be delivering the next level of technology to our warfighter to continue to try to maintain our advantage over our adversaries.”

    NAVAIR views the shift to model-based systems engineering as an essential step in shortening and modernizing its abilities to deliver high-quality, state-of-the-art programs. They enrolled their first cohort of 60 engineers and managers into the MIT program in March 2017. The third group will soon complete the four-month program, which has become a key piece of the NAVAIR transformation by building the awareness and skills needed to successfully implement model-based systems engineering.

    Procuring naval aviation assets

    NAVAIR procures and helps sustain all of the Navy and Marine Corps aviation assets — helicopters, jets, transport aircraft, bombs, avionics, missiles, virtually any kind of weapon used by U.S. sailors and Marines. Their responsibilities include research, design, development, and systems engineering of these assets internally and with contractors; acquisition, testing and evaluation of these assets, as well as training, repair, modification, and in-service engineering and logistics support.

    “We are the organization that receives requirements from the Pentagon for a new program, puts them out on contract, does the acquisition of that project and also provides the technical oversight and programmatic oversight during the development of that project to be sure it is maturing as expected and delivering what is needed,” says David Meiser, Advanced Systems Engineering Department head, who is helping to lead the systems transformation effort at NAVAIR.

    NAVAIR employs more than 10,000 engineers, plus logisticians, testers, and specialists in a variety of different areas from software, to engines, to structures.

    “We are kind of like the FAA for naval aircraft,” says Meiser, referring to the Federal Aviation Administration. “We go through the whole test and certification process and also provide the air-worthiness authority. Once the system is tested and does what it needs to do, we also provide the support mechanism to have ongoing logistics and engineering support needed to maintain these aircraft for 20-50 years.”

    Design changes needed

    It takes approximately 15 years to build a new weapons system, such as a fighter jet, from idea to fruition. A key reason is increasing system complexity. In the 1960s, the technology of a jet was based largely on the air vehicle itself. Today, everything is integrated with the aircraft, from how it flies to its targeting system, its weapons capabilities, its visual system, and more.

    “They are so much more complex in functionality and capabilities and it’s harder to develop and manage all of the requirements and interfaces,” says Systems Transformation Director Jaime Guerrero of NAVAIR’s Systems Engineering Development and Implementation Center. “You need a model-based approach to do that as opposed to a document-centric approach which has been how NAVAIR has operated for decades.” 

    Adding to the pressure, NAVAIR leadership was mandating a cycle-time collapse from 15 years to less than half that, David Cohen says.

    “That’s where we need to be,” Cohen adds. “The threats we are trying to address with these weapons systems are evolving at a faster pace. We have to be a lot more agile in terms of getting a product to the fleet much faster.”

    In 2013, NAVAIR participated in a research effort with the DOD’s Systems Engineering Research Center (SERC) to find better and faster ways of doing systems engineering. After collaborating with industry partners, academia, and other government agencies, SERC determined that it was technically feasible to pursue modeling methods as the way forward in the future. Between 2014 and early 2016, NAVAIR engineering leadership researched modeling methods with its key industry partners like Boeing, Lockheed Martin, Raytheon, and 30 other companies to see how they were executing model methods, as well as those practiced in the auto industry, where short design timelines are the norm. They also enlisted input from other government agencies that were already moving their processes to a model-centric method.

    “We absorbed a lot of information from these industries to see that we could use a different methodology to collapse cycle time,” Guerrero says.

    In those two years, NAVAIR researched 40-50 companies, universities, and government agencies and decided it was technically feasible to transform, in about 10 years, into a different organization with different skills, tools, methods, and processes. NAVAIR committed to shifting to model-based systems engineering, incorporating this paradigm shift into the organization.

    Implementing model-based systems engineering

    Leadership, however, was not supportive of a 10-year transformational window. They wanted to aggressively compress the timeline.

    “When we realized leadership wanted to compress the timeline to about three years for transforming the organization, we decided to go out and search for experts and the best training we could get, the best tools on the market,” Guerrero recalls.

    They started searching for the resources needed to do that and attended workshops and symposiums. One of them was sponsored by NASA’s Jet Propulsion Laboratory, which was a few steps ahead in initiating a model-based systems engineering (MBSE) perspective. There, Meiser and Guerrero learned of the MIT program from Bruce Cameron, director of the Systems Architecture Lab at MIT, who developed the coursework in 2016 and was also in attendance.

    “Some of our partners, especially Boeing, were already involved with the MIT coursework and they recommended it,” says Guerrero. It had also become a command initiative at NAVAIR to push a fast transformation program. “So we had the command initiative and the resources to go out and train as many people as possible,” he says.

    NAVAIR committed to the courses as a way to establish a common language and to introduce its workforce to the concepts, tools, and terminology that would foster the deeper conversations necessary to adopt MBSE concepts and advance the level of training.

    The entire four-course online program, which runs on the edX online learning platform, takes about 20 weeks to complete. Each course is gated, with weekly lessons requiring about four to five hours of work per week and combining videos, reading material, assessments, and coursework. At the end of each week, students complete a project that is reviewed by peers.

    When Guerrero and Meiser completed the program in the spring of 2017, they realized it would help align NAVAIR’s leadership by educating its command leaders on why modeling is part of the solution for becoming a more agile organization.

    “The four-course series provides a high-level explanation of how to do systems engineering and architecture in a model-based environment,” Meiser says. “At the end of these courses you may not be a total practitioner of model-based engineering, but you have an appreciation of the value of model-based methods.”

    Management commitment from top leadership 

    “We came out of that and realized we needed to require it of a lot of our senior leaders here and some of our chief engineers, because it is not about making them modelers or making them experts in the process,” adds Guerrero. “It’s about informing them of how this model-centric method is going to help us as an organization. Leaders have to be in agreement and push in the same direction to make this quick transformation happen.”

    Fortunately, NAVAIR’s top leadership was immediately on board.

    “What we have going for us at NAVAIR is that they’ve embraced MBSE and faster cycle times as a command initiative and they’ve committed to doing this comprehensively across NAVAIR,” says Meiser, adding they’ve been given the budget to pursue MBSE and top-level support.

    Vice Admiral Paul Grosklags, NAVAIR commander, even prepared a video discussing the path to going digital with acquisition, sustainment, and business processes and how it has the potential to increase readiness and speed to the fleet. Encouraged by that, Guerrero and Meiser produced their own YouTube video to help get the message out about the systems engineering transformation at NAVAIR.

    As a result, NAVAIR targets the MIT program toward management and command leaders across all of its engineering disciplines as well as logistics and testing, the people who have to facilitate the change. Though they are not the individuals responsible for doing the modeling, they are required to understand the capabilities of model-based systems engineering.

    Now that nearly 150 NAVAIR personnel have completed the program, the feedback has been very encouraging. Some with more experience believe it was a great reinforcement of what they knew or should have known. Others say it helped them understand certain MBSE aspects they were not previously familiar with.

    “We’ve given it to a fairly diverse group of people,” says Meiser. “One thing I had heard regularly is that people say once they’ve been through it that they look at the problem differently. That has been the effect we’ve wanted to have. They start to think more about how to approach the problems in a model-based approach.”

    Participants have also realized the value of pursuing this type of education together in the MIT program.

    “We have learned from others NOT to try to do this transformational work in isolation,” adds Meiser. “This discipline is fairly new and having access to others pursuing the same thing has been very helpful for us.”

    The leadership perspective

    Cohen appreciated the non-intrusive delivery method as well as the content, feeling that the online training provided a good balance of depth and instruction time. “It has been an integral first step, especially for bringing the broad workforce at large into the discussion of what MBSE is,” he says.

    Cohen knows NAVAIR is embarking on a monumental challenge. After completing the program himself, he realized he had to adjust his expectations.

    “It helped alert me to some of the cautionary areas where I could be considered overly optimistic in my expectations,” he says. “Throughout the course, there was more emphasis on quality of the product, not just on rapid cycle time.”

    He was particularly impressed by the level of respect, knowledge, and professional experience demonstrated by others involved in the course.

    “I had to take on board and value the experience of people who have been working in this field a lot longer than we have,” he says. He admits the coursework tempered his aggressive expectations, but it simultaneously highlighted where NAVAIR needed to invest more research and resources in certain program areas to achieve the faster results expected by top leadership.

    Cohen credits the program with shaping the transformational process at NAVAIR by pointing out where they need to pursue deeper dives for the next level of depth in workforce training.

    “The course gives you the understanding that MBSE has layers to it,” he says. “So depending on where you are in the organization, you will need to get more in-depth training in your area. We found the course introduced everyone to the depth and breadth of what model-based engineering is, its applications and how it’s used.”

    At NAVAIR, the program has worked because it intentionally involves a broad diversity of people across the organization rather than a few silos of an entire group or department. They recommend the program for those at higher levels of an organization who are facilitating the engineering change; those with more job-specific responsibilities should receive training targeted at the precise areas they are going to be implementing.

    “The courses have helped everyone understand the over-arching goal and establish a common language,” says Cohen. “Although the transition to model-based systems engineering is complicated, we have expanded our skills and contacts tremendously in the process and crystallized where we need to focus to get results.”

August 2, 2018

  • When A. R. Rahman, two-time Academy Award winner, singer-songwriter, and music producer from India, came to visit and take a course at MIT in July, he was in his element during a tour of interactive music systems on campus.

    Anantha Chandrakasan, dean of the School of Engineering, led Rahman and his group to Building 24 where the small group of mostly non-musicians jammed together using their smartphones to sound off as brass, clarinet, percussion, or strings.

    Rahman tapped a sneakered foot to the beat. “This is fantastic,” says Rahman of the performance orchestrated by MIT professor of the practice Eran Egozy ’95, MEng ’95, who teaches, among other things, 21M.385 / 6.809 (Interactive Music Systems) — the first MIT music class that is also an electrical engineering and computer science class.

    These creative points of convergence are exciting, says Chandrakasan, who is also the Vannevar Bush Professor of Electrical Engineering and Computer Science. “There are tremendous opportunities to bring computing and artificial intelligence, sensing, and other technological advances to the world of music,” he says.

    Rahman’s own music is known to experiment with the fusion of traditional instruments with new electronic sounds and technology. Like Egozy, he is passionate about using technology to enhance the experience of listening to or making music and enabling people to engage with it.

    “You created games about things that are constructive, not destructive,” Rahman says with a nod of approval to Egozy, co-founder and chief scientist of the company that brought the world “Guitar Hero” and “Rock Band.”

    A recipient of multiple Academy Awards, Rahman is especially interested in harnessing the power of technology in music to counter inequality, hate, and violence in social media and global discourse.

    Music and technology for the next generation

    Rahman was on a whirlwind MIT tour that involved visits with a string of creative academics in multiple realms: music, technology, artificial intelligence, machine learning, and robotics among them.

    His visit capped off a week during which Rahman dove into a four-day course offered by MIT Professional Education, “Advances in Imaging: VR-AR, Machine Learning, and Self-Driving Cars,” which is led by Ramesh Raskar, an associate professor of media arts and sciences at the MIT Media Lab.

    The course immersed participants in imaging and how cameras are used in machine learning, self-driving cars, health, industrial settings, and more. Rahman took it all in, says Raskar.

    “A.R. is focused on how to use imaging, machine learning, and AI not just for entertainment but to impart a sense of responsibility and cohesiveness and togetherness for the younger generation,” he says.

    “We were pleased to offer a course that could contribute to A.R.’s quest,” says Bhaskar Pant, executive director of MIT Professional Education. “His work in entertainment and education exposes enormous numbers of people to the latest technologies. That is something we want to support.”

    The tour stopped briefly on the green at Killian Court. “We are very happy to engage with A.R. here at MIT,” adds Chandrakasan, with a smile as Rahman’s family and friends snapped photographs in front of the Great Dome.

    “A.R.’s participation in the course was coupled to a larger discussion about the role of computing and music and the role technology, such as machine learning and vision, can have in helping people experience the benefits of making music and media,” says Chandrakasan.

    New tools for humanity

    Rahman’s next stop was a presentation by Dina Katabi, the Andrew and Erna Viterbi Professor in the Department of Electrical Engineering and Computer Science. She has created a WiFi-like device that uses radio signals to monitor breathing, sleep, heart rate, and gait, and to detect falls.

    “This kind of technology is seamless and not intrusive at all,” says Rahman after the presentation. “Many people have complicated lives, but they love their parents and cannot take care of them in person. This is amazing.”

    Rahman was equally engaged by a demonstration of an autonomous wheelchair, an invention spearheaded by Daniela Rus, the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science and director of MIT’s Computer Science and Artificial Intelligence Laboratory.

    Finally, Rahman was off to meet with composer Tod Machover, the Muriel R. Cooper Professor of Music and Media. “Today was fascinating,” says Rahman on his return to the Media Lab, which he toured earlier in the day.

    “I have a deep interest in music and how to bring technology to human emotion, how to conquer it to make beautiful things, to create emotions, to create beautiful songs,” he says. “But at heart, my interest is always in humanity. We need all kinds of new ideas and innovations that will help people.”

  • Constantinos (“Costis”) Daskalakis, an MIT professor in the Department of Electrical Engineering and Computer Science and principal investigator at the Computer Science and Artificial Intelligence Laboratory (CSAIL), has won the 2018 Rolf Nevanlinna Prize, one of the most prestigious international awards in mathematics.

    Announced today at the International Congress of Mathematicians in Brazil, the prize is awarded every four years (alongside the Fields Medal) to a scientist under 40 who has made major contributions to the mathematical aspects of computer science.

    Daskalakis was honored by the International Mathematical Union (IMU) for “transforming our understanding of the computational complexity of fundamental problems in markets, auctions, equilibria, and other economic structures.” The award comes with a monetary prize of 10,000 euros.

    “Costis combines amazing technical virtuosity with the rare gift of choosing to work on problems that are both fundamental and complex,” said CSAIL Director Daniela Rus. “We are all so happy to hear about this well-deserved recognition for our colleague.”

    A native of Greece, Daskalakis received his undergraduate degree from the National Technical University of Athens and his PhD in electrical engineering and computer sciences from the University of California at Berkeley. He has previously received such honors as the 2008 ACM Doctoral Dissertation Award, the 2010 Sloan Fellowship in Computer Science, the Simons Investigator Award, and the Kalai Game Theory and Computer Science Prize from the Game Theory Society.

    Created in 1981 by the Executive Committee of the IMU, the prize is named after the Finnish mathematician Rolf Nevanlinna and is awarded for outstanding contributions to the mathematical aspects of the information sciences. Recipients are invited to participate in the Heidelberg Laureate Forum, an annual networking event that also includes recipients of the ACM A.M. Turing Award, the Abel Prize, and the Fields Medal.