
SANDIA LAB NEWS

Lab News -- December 19, 2008

Counterintuitive Sandia simulation: After a certain point, more chip cores mean slower supercomputing

By Neal Singer

Simulations by Sandia researchers show that the worldwide attempt to increase computer speed on the most complex problems merely by increasing the number of processor cores on individual chips unexpectedly reaches a quick dead end for many applications important to Sandia’s missions.

A Sandia simulation graph that modeled algorithms important for network discovery — the ability of a processor to find and be seen by other devices on the network — shows a significant increase in speed going from two to four cores, but an insignificant increase from four to eight. Exceeding eight cores causes a decline; sixteen cores perform barely as well as two, and the dropoff grows steeper as more cores are added.

The problem is the lack of memory bandwidth, as well as contention between processors over the memory bus available to each processor. (The memory bus is the set of wires used to carry memory addresses and data to and from the system RAM.)
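
A purely illustrative sketch (not Sandia’s simulation, and with made-up parameters chosen only to produce the plateau-then-decline shape described above) shows how a shared memory bus can flatten and then reverse multicore speedup:

# Toy Python model -- illustrative assumptions only, not Sandia's simulator.
def runtime(cores, compute=1.0, memory=1.0, bus_streams=4, contention=0.03):
    """Normalized runtime: compute parallelizes, but memory traffic is capped
    by a bus that saturates at bus_streams concurrent requesters, and every
    extra core adds a small contention penalty."""
    compute_time = compute / cores
    memory_time = memory / min(cores, bus_streams)
    contention_time = contention * (cores - 1)
    return compute_time + memory_time + contention_time

base = runtime(1)
for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:3d} cores: speedup {base / runtime(n):4.2f}x")

In this toy model the compute term keeps shrinking as cores are added, but the memory term stops improving once the assumed bus saturates, and the contention term eventually drags the speedup back down.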

To use a supermarket analogy, if two clerks at the same checkout counter are processing your food instead of one, the checkout process should go faster.

Or, you could be served by four clerks. Or eight clerks. Or sixteen. And so on.

The problem is, if each clerk doesn’t have access to the groceries, he or she doesn’t necessarily help the process. Worse, the clerks may get in each other’s way.

Similarly, it seems a no-brainer that if one core is fast, two would be faster, four still faster, and so on.

But the lack of immediate access to individualized memory caches — the “food” of each processor — slows the process down instead of speeding it up once the number of cores exceeds eight, according to a simulation of high-performance computers by Richard Murphy, Arun Rodrigues (both 14222), and former student Megan Vance. A graph of the simulation was published in the Nov. 8 online issue of IEEE Spectrum.

“To some extent, it is pointing out the obvious — many of our applications have been memory-bandwidth-limited even on a single core,” says Arun. “However, it is not an issue to which industry has a known solution, and the problem is often ignored.”

“The difficulty is contention among modules,” says James Peery, director of Sandia’s Computations, Computers, Information and Mathematics Center (1400). “The cores are all asking for memory through the same pipe. It’s like having one, two, four, or eight people all talking to you at the same time, saying, ‘I want this information.’ Then they have to wait until the answer to their request comes back. This causes delays.”

“The original AMD processors in Red Storm were chosen because they had better memory performance than other processors, including other Opteron processors,” says Ron Brightwell (1422). “One of the main reasons that AMD processors are popular in high-performance computing is that they have an integrated memory controller that, until very recently, Intel processors didn’t have.”

Multicore technologies are considered a possible savior of Moore’s Law, the prediction that the number of transistors that can be placed inexpensively on an integrated circuit will double approximately every two years.

“Multicore gives chip manufacturers something to do with the extra transistors successfully predicted by Moore’s Law,” Arun says. “The bottleneck now is getting the data off the chip to or from memory or the network.”
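
Stated as arithmetic, Moore’s Law is simply a doubling roughly every two years. The starting transistor count below is an arbitrary assumption used only to illustrate how quickly the transistor budget grows:

# Moore's Law as a doubling curve; the starting count is an assumed figure.
start = 100e6  # assumed transistor count in year zero
for years in range(0, 11, 2):
    print(f"after {years:2d} years: ~{start * 2 ** (years / 2):,.0f} transistors")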

A more natural goal for researchers would be to increase the clock speed of single cores, since the vast majority of applications, from word processors to music and video programs, are written for single-core performance. But power consumption, increased heat, and the basic physics of parasitic currents mean that designers have reached the limit of how fast common silicon processors can be clocked.

“The [chip design] community didn’t go with multicores because they were without flaw,” says Mike Heroux (1416). “The community couldn’t see a better approach. It was desperate. Currently we are seeing memory system designs that provide a dramatic improvement over what was available 12 months ago, but the fundamental problem still exists.”

In the early days of supercomputing, Seymour Cray produced a superchip that processed information faster than any other chip. Then a movement — led in part by Sandia — proved that ordinary chips, programmed to work different parts of a problem at the same time, could solve complex problems faster than the most powerful superchip. Sandia’s Paragon supercomputer, in fact, was the world’s first parallel processing supercomputer.

Today, Sandia has a large investment in message-passing programs. Its Institute for Advanced Architectures, operated jointly with Oak Ridge National Laboratory (ORNL) and intended to prepare the way for exaflop computing, may help solve the multichip dilemma.

ORNL’s Jaguar supercomputer, currently the world’s fastest for scientific computing, is a Cray XT model based on technology developed by Sandia and Cray for Sandia’s Red Storm supercomputer, whose original and unique design is the most copied of all supercomputer architectures. — Neal Singer



Sometimes walls aren’t the answer

Sandia group provides access delay options for high-value facilities

By Stephanie Holinka

When most people think about physical security, they think about building walls to keep outsiders out of an area. Turns out walls aren’t always the answer.

“Solid walls aren’t always a good solution,” says Dave Swahlan, manager of Active Response and Denial Dept. 6475. “Any kind of solid barrier can provide cover for bad guys,” he says, which can make it harder to distinguish the early stages of an attack in time to mount a response.

In the past four and a half years, Sandia has conducted some 28 vehicle-barrier tests at the Texas Transportation Institute in College Station, Texas. Sandia conducts tests for the Department of State on systems intended for perimeter security, checkpoints, and other security concerns.

Sandia has tested many different items and configurations, such as barriers built from large concrete blocks, bollards, sections of walls, Jersey barriers in multiple variations, and trucks configured and parked to serve as temporary barriers.

The Department of State has many facilities that require physical security specialized for their location and for the unique concerns of the facility and area. “Some facilities may only have a sidewalk-sized area for a barrier,” Dave says. “In others, utilities and footings in already-existing locations can impact how large a barrier can be.” He says that even facilities with limited space must still try to prevent facility breaches from big vehicles such as dump trucks.

The barrier system most recently tested used a modified box beam with additional items inside the beam to provide additional delay should an adversary try to breach the barrier by cutting it open.

During the test, a large truck was pulled forward and brought up to speed by a tow truck and pull system that disengaged at the last second, after the vehicle had reached its target speed. Instruments and video footage were later used to analyze how the barrier behaved during the attempted breach.

Sometimes a barrier’s movement is an important part of the design. “Most people don’t realize that movement is important in barriers,” Dave says. “High deceleration loads with a rigid barrier can still sometimes bring the load into the area you’re trying to protect.” Stopping a truck too quickly, he says, can lead to bed-shearing, where the load continues forward from the truck.
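
The arithmetic behind that concern is simple: average deceleration scales with the square of the impact speed divided by the stopping distance, so a barrier that stops a truck within a fraction of a meter puts far more load on the cargo than one that yields a little. The speed and distances below are assumptions for illustration, not values from the Sandia tests:

# Illustrative only: assumed impact speed and stopping distances, not test data.
v = 22.0  # impact speed in m/s (about 50 mph), assumed
for stop_distance in (0.25, 0.5, 1.0, 2.0):  # meters the barrier and truck deform or move
    decel = v ** 2 / (2 * stop_distance)     # average deceleration, m/s^2
    print(f"stop in {stop_distance:4.2f} m: about {decel / 9.81:5.1f} g on the load")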

Test engineer Mark McAllaster (6475) has been integrally involved in all the tests at College Station, from the design process to the logistics of arranging, funding, and purchasing testing equipment. He’s also responsible for ensuring the test components were correctly manufactured, installed, and tested under the proper protocol so the design is eligible for certification to the ASTM F2656-07 standard.

“Each test allows engineers to modify designs to potentially improve a barrier’s performance,” Mark says.

Mark says the most recent barrier tested met its design goal, adding that “engineers always hope that a design can be improved upon.” Mark says the test’s biggest surprise happened after the crash test, when the barrier proved more resistant to common cutting techniques than originally estimated.

Sandia is contracted by the Department of State through FY09 for both new design and testing. The State Department is interested in creating a portfolio of generic designs that it can contract out and have built for deployment. Sandia expects to design six generic barriers and test each one. Those designs may be gateway designs or perimeter designs depending on what the State Department decides it needs for the near future.

Another purpose of the tests is to create a collection of barrier designs that are precisely understood, so that the Department of State can have them built any time it requires new barriers, rather than relying on off-the-shelf systems.

In the past, the Department of State ranked barriers using “K-ratings,” which set criteria for how resistant a barrier is to breaching. The department is replacing the K-ratings with the ASTM standard so that organizations can compare barriers against the same criteria and interpret test results the same way. Right now, that testing information is not all kept in one place; Sandia hopes the move will produce both a single standard and a single repository for the information.

The testing group had company for its most recent test. More than 40 home-schooled students observed the testing. Mark says the kids wanted to know the purpose for the testing. “They asked many very good questions,” Mark says, “and we talked about the purpose and the logistics of setting up the tests.”

The test was explained to them in advance so students would know what to look for. Mark also wanted the students to know how the truck was being propelled, so they would be assured there was not a driver who could be injured. Mark says the most enjoyable part was telling them “do not do this at home with your Dad’s truck.”

Mark says the kids were amazed by the crash of the truck into the barrier, adding that there was a lot of yelling and clapping as the truck plowed into the barrier. After the test, the students yelled, “Cool, we want to see you do it again!” — Stephanie Holinka



National Security Agency honors Navid Jam for videoconferencing security work

By Patti Koning

This fall, Navid Jam (8965) was chosen as one of three finalists for the National Security Agency's (NSA) 2008 Frank B. Rowlett Award, which recognizes outstanding organizational and individual excellence in the field of information systems security.

“Navid represented a team that found critical vulnerabilities in common off-the-shelf videoconference systems that are widely used by government agencies,” says Len Napolitano, director of Computer Sciences and Information Systems Center 8900. “They identified the problem and then Navid carried the message outside the Labs. Now the government is using their work as a standard, to the point that they are holding up procurements.”

Navid was awarded an honorable mention for his contributions, but this is a case in which being a finalist is truly a remarkable achievement. The Rowlett Award typically stays within the armed services. Navid was the only nonuniformed individual nominee this year; never has there been an individual winner from outside the Department of Defense (DoD).

Navid says that at the ceremony he was surprised and pleased by the positive feedback he received from people in high-level positions with a variety of government agencies.

“Seeing the breadth and depth of customers and the impact we have had was rewarding,” he says. “I think this shows the recognition of the important role national laboratories play in information assurance and cyber security for the nation and also recognition of this new, important area in which a national lab has had an impact across many areas including the government, vendors, and the standards community.”

Navid and Len attended the awards ceremony in Washington, D.C., on Oct. 30. Before the ceremony began, all six nominees sat down privately with NSA Director Lt. Gen. Keith Alexander to discuss their work. The ceremony itself featured a four-minute video on the work of each nominee.

Sandia’s videoconferencing work began four years ago as an internal, operational project. Corbin Stewart, a technologist in Videoconference and Collaborative Technologies Dept. 8947, discovered a security issue with videoconferencing software.

“I was trying to update a feature and discovered that it did not require authentication,” he says. “That was a red flag.”

Jim Berry (8944), manager of Dept. 8947 at the time, decided the potential security issues surrounding videoconferencing were worth investigating. Using funding from Jim’s department, Corbin, Steve Hurd (8965), a Sandia computer scientist and program manager for the Labs’ Center for Cyber Defenders (CCD), and a group of college interns in the CCD initiated a risk analysis nearly four years ago that focused on commercial codecs. Navid was one of those interns.

“He really stood out and took the lead to advertise these types of vulnerabilities in embedded communication applications,” says Corbin. “It’s gratifying to see that the work we have done is benefitting so many others.”

He adds that while one reason for turning to the CCD was that interns are inexpensive, the arrangement wound up bringing fresh insights to the problem.

Another former CCD student, Elliot Proebstel (8965), has also contributed to the work.

“We are really impressed with Elliot,” says Corbin. “During his second summer with the CCD, he was able to circumvent access controls on the evaluation device and recover the administrative password in only a few hours.”

Sandia’s primary concern was analyzing and mitigating security risks on its own network, risks addressed and rectified immediately after security holes were found. The research team evaluated hardware from several industry sources. After analyzing the devices and related hardware and software, the team developed “attack trees” (step-by-step tactics) and performed a variety of attacks to demonstrate vulnerabilities. The objective was to attempt system compromises, independently assess vulnerabilities that were found to exist, and develop “best practices” and tools to aid users.
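
An attack tree is just a goal broken into sub-goals and concrete steps. The sketch below shows that structure with generic, hypothetical entries; it does not reflect the specific vulnerabilities the team identified:

# Minimal attack-tree sketch. Node names are generic, hypothetical examples
# of "step-by-step tactics" -- not the vulnerabilities found by the Sandia team.
from dataclasses import dataclass, field

@dataclass
class Node:
    goal: str
    children: list["Node"] = field(default_factory=list)

tree = Node("Compromise videoconferencing unit", [
    Node("Reach the device over the network", [
        Node("Find the unit exposed on an open port"),
        Node("Bypass network access controls"),
    ]),
    Node("Gain administrative access", [
        Node("Use default or recovered credentials"),
        Node("Exploit an unauthenticated management service"),
    ]),
])

def walk(node, depth=0):
    """Print each goal indented under the goal it supports."""
    print("  " * depth + node.goal)
    for child in node.children:
        walk(child, depth + 1)

walk(tree)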

This internal project developed into a full-fledged program, with work-for-others funding from vendors of videoconferencing and embedded communications systems and from government agencies that use such systems.

“For external customers, we approach the problem from two perspectives,” says Navid. “Sandia acts as a consultant for various government agencies, advising them on architecture, setups, and potential vulnerabilities. We also work with vendors to help them analyze some of the risks and vulnerabilities apparent with their systems.”

Getting to this point was not easy. Navid recalls several years when no one would take him seriously.

“When we raised these concerns, they fell on deaf ears initially,” he says. “We didn’t give up and began sharing what we had learned with our government contacts and partners. We felt strongly that this was an important issue that needed to be addressed, not swept under the rug. No one was concerned about it and the vendors weren’t listening to Sandia. That was something that was putting the nation at risk.”

Not rocket science

Mitigating the risks is actually fairly simple. The problem, explains Corbin, is that people often don’t think of videoconferencing devices as computers that run the same kinds of web and FTP services as a PC, and that therefore carry the same need for security.

“Some vendors continue to tout the latest features, benefits, and productivity gains that videoconferencing technology offers, but not enough thought or effort has been placed in securing these devices,” says Navid. “The irony is that it isn’t rocket science. It’s really akin to home PCs. By now, most people connected to the Internet understand the need to have antivirus software. Similarly, people responsible for videoconferencing events should understand that a videoconferencing device operates much like a PC and as such requires protection such as a firewall program. But there are a lot of companies out there who overlook that need.”
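
As a minimal illustration of that point, the same reachability check one might run against a PC server applies to a videoconferencing unit. The address below is a placeholder, and a probe like this should only be run against equipment you own and are authorized to test:

# Checks whether PC-style services answer on a videoconferencing unit.
# The address is a placeholder (TEST-NET); probe only devices you are
# authorized to test.
import socket

DEVICE = "192.0.2.10"  # placeholder address, not a real unit
SERVICES = {21: "FTP", 23: "Telnet", 80: "HTTP", 443: "HTTPS"}

for port, name in SERVICES.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        status = "open" if sock.connect_ex((DEVICE, port)) == 0 else "closed or filtered"
        print(f"{name:6s} (port {port}): {status}")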

Navid began giving talks at DOE, DoD, and public Internet security conferences. He says he talked to anyone and everyone willing to learn about the risks of embedded collaboration systems. A breakthrough came in 2006 at a Sandia Red Team conference, where he met with the NSA Red Team and shared the issues Sandia had identified and the tools and techniques to mitigate those issues.

“The NSA Red Team has been a terrific partner. They really championed our cause and helped us gain fairly high visibility throughout the government,” says Navid. Recently the Defense Information Systems Agency (DISA) released a Security Technical Implementation Guide (STIG) for videoconferencing based on Sandia’s work.

Sandia set up the Center for Collaborative Security (CCS) to educate companies and organizations about videoconferencing vulnerabilities and the need for industry-wide fixes. This virtual team comprises a wide swath of computer and security experts, including network operators, vulnerability researchers, IT architects, and systems analysts and spans many directorates across Sandia, including 8900, 8100, and 5600.

The CCS conducts research and development on security issues related to collaboration systems such as distributed information-sharing applications and instant messaging solutions. It provides basic tools and information on security vulnerabilities found in all types of collaboration devices, as well as best practices to enhance the security of collaborative systems. The CCS also provides a method for external partners to establish work-for-others business agreements with Sandia, which can perform company-specific evaluations and assessments.

Navid continues to drive broad acceptance of Sandia’s work, working with standards bodies and meeting with more potential customers.

He’s enjoying his role, which could be described as spokesman, salesman, interface, or, as he jokes, “just the pretty face.” Navid says his key strength is being able to bridge the technical and business aspects and to understand all the issues that come into play — understanding how the business works. — Patti Koning
