
A -- Software and Systems Test Track

Solicitation Number: Reference-Number-BAA-06-13-IFKA
Agency: Department of the Air Force
Office: Air Force Materiel Command
Location: AFRL - Rome Research Site
Notice Type: Presolicitation
Added: March 6, 2006
FUNDING OPPORTUNITY NUMBER: BAA #06-13-IFKA

CFDA Number: 12.800



DATES: It is recommended that white papers be received by the following dates to maximize the possibility of award: for FY 06 funding, by 30 March 2006; for FY 07, by 31 January 2007; for FY 08, by 31 October 2007; for FY 09, by 31 October 2008; and for FY 10, by 31 October 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. FORMAL PROPOSALS ARE NOT BEING REQUESTED AT THIS TIME. See Section IV of this announcement for further details.



I. FUNDING OPPORTUNITY DESCRIPTION:



Background:

It is increasingly difficult to create software that deals successfully with increased device and system complexity. Moreover, networked distributed systems create vast increases in the scale and scope of information processing applications, exacerbating the challenges to system engineers' ability to specify, design, build, verify and test software. This situation is an emerging issue in information technology in general, but the requirements of military systems set them sharply apart from non-military applications in terms of reliability, robustness, security, interoperability and real-time operation.



Business, government, and technical endeavors ranging from financial transactions to space missions increasingly require complex software systems to function correctly. The complexity of the software arises from stringent requirements (e.g., for reliability in performance and integrity of the data used), the need to support a range of interactions with the environment in real time, and/or certain structural features. These attributes make software difficult to produce.



Major hardware-software failures in defense acquisition programs have occurred both because components that were expected to interoperate properly did not, and because tools did not work as advertised. Interoperability is required for network-centric environments, but experience has shown it very difficult to achieve. A scalable, flexible, realistic synthetic testing environment is required for stressing tools against key benchmarks, for assessment of tool utility by key program offices, and for use as a synthetic environment for testing tools against large systems (tens of millions of SLOC or larger) or systems-of-systems.



Objective:



This program will be acquired in two phases. Phase I will be the definition phase and will consist of (a) defining, developing, and documenting the Concept of Operations (CONOPS), a user-oriented document that describes system characteristics for a proposed system from the users' viewpoint; and (b) defining, developing and documenting the architecture and the fundamental organization of the Systems and Software Test Track as embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution. Initial concepts along with the final CONOPS and architectures will be presented to representatives from Government, Industry and Academia; therefore, the technical data rights proposed need to be consistent with this requirement. White papers are currently being solicited for Phase I only.



Phase II will be the environmental development and operations phase. Additional information concerning Phase II will be issued around January 2007, and white papers for Phase II will be solicited at that time.



The overall objective of this Systems and Software Test Track BAA program is to provide an open framework environment where an assortment of experimental tools and products may be deployed and allowed to interact in real-time, interactive, evolutionary and interdependent ways, thereby allowing rigorous testing of new technologies, methodologies and theories in support of the Software-Intensive Systems Producibility Initiative. The Systems and Software Test Track will facilitate testing of Software-Intensive Systems Producibility research products and methods, provide an environment for research on DoD embedded systems and software problems, provide an ability for university and industry leverage of technology development, and establish a capability for successful technology transition and transfer.



The environment should be open and available for use by developers as well as for independent analysis by the facility operators. This independent analysis allows the facility operators to support major defense acquisition program offices and to analyze the utility of tools. Program offices may bring their unsolved problems to the test track either for help in solving them or to look for needed utility amongst the tools available. Lastly, this synthetic environment should provide a place where big codes can be tested in the loop, allowing requirements verification prior to production and deployment. This risk reduction affords the ability to verify and validate the functionality of today's complex software-intensive systems while providing a realistic environment for researchers to verify their tools against realistic problems.



The Systems and Software Test Track should also provide a place (possibly virtual and not a single physical location) for experimental verification of Software-Intensive Systems Producibility technologies, given their novelty and the potential complexity of the underlying theories. The experimental platforms should incorporate software technology to instrument, monitor and test large-scale applications. Challenge problems for the open experimental platforms should be made accessible to all the research teams. The experimental platform research should include subtasks to conduct large-scale coordination experiments and to develop methods and tools for evaluating aggregate performance of applications. This environment should provide a full range of collaborative technology challenges, run-time platforms and applications, experiments, evaluations, and demonstrations. A common infrastructure will enable control and data flow between application components in a distributed environment. The open experimentation environment will provide the fundamental reference architecture and underpinnings, helping researchers to develop and test their designs as well as facilitating transition of promising technologies into production use.



Research Concentration Areas:

The goal of the Phase 1 research is to A) Define, develop, and document the Concept of Operations (CONOPS), a user-oriented document that describes system characteristics for a proposed system from the users' viewpoint; and B) Define, develop and document the architecture and the fundamental organization of the Systems and Software Test Track as embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution.



The CONOPS document should be written to communicate overall quantitative and qualitative system characteristics to the user, buyer, developer, and other organizational elements. It should describe the system, operational policies, classes of users, interactions among users, and organizational objectives from an integrated systems point of view.



Areas to consider when developing the CONOPS include, but are not limited to:



Intellectual property issues;



International Traffic in Arms Regulations (ITAR) and export control issues;



Tool Evaluation;



Features and methods to promote and monitor practitioner and researcher interaction as well as providing academic researchers access to large complex software systems;



Technology Transition/Transfer of improved embedded software component integration tools and techniques into military system development programs;



Supporting facilities and staffing, location(s), networking, firewall and policy related issues.



The Architecture definition should provide the ability to understand and control those elements of system design that capture the system's utility, cost, and risk. These elements could be the physical components of the system and their relationships, the logical components or enduring principles or patterns that create enduring structures. The definition shall provide a rigorous definition of what constitutes the fundamental organization of the Systems and Software Test Track embodying all information regarding elemental relations, interactions and interfaces.



Areas to consider when developing the Architecture include, but are not limited to:



The environment, virtual or not, to host, execute and test technology, both hardware environment (run time platform) and software environment (run time platform);



Interfaces and collaborations;



Mechanisms to support research and analysis of DoD problems including, but not limited to: software artifacts, benchmarks, executables, source code, design documents, requirements documents, examples, models, fault data, lessons learned, software construction files and tools;



Data repositories for results, success stories, benchmarks, quantitative and qualitative results, software disaster studies defining problems and basic research areas, ability to upload/download artifacts;



Measurement Techniques and Software Forensics;



Metrics including: reduced development time; ease with which domain experts and software engineers can interact; ease with which different domain experts can specify and design code independently of one another; usability, i.e., the 'naturalness' of the modeling language from which code is generated; ability of test engineers to modify and tune code in the field; and accurate automated documentation in design;

Mechanisms for studying innovative systems and dynamic processes, not static snapshots.



References: Institute of Electrical and Electronics Engineers, IEEE Guide for Information Technology-System Definition-Concept of Operations (CONOPS) Document. IEEE Std 1362-1998, IEEE Computer Society Press, 1998.



Anticipated Schedule: Phase 1 - The Definition Phase will begin with a kickoff workshop and proceed with several months of CONOPS and Architecture development for the Systems and Software Test Track. A review workshop will be held midway through the Phase 1 effort, where the initial concepts can be briefed to representatives from Government, Industry and Academia invited to review the ideas and provide feedback. The final months should be devoted to completing the initial concepts and incorporating comments from the review workshop. A final review workshop should be held in the final month, where the final CONOPS and architectures can be presented to representatives from Government, Industry and Academia invited to review the ideas and provide feedback. Phase 1 awardees will have to present the full extent of their work at the mid-term and final workshops, attendance at which will not be limited to only the Government and awardees. Also, the Phase 1 work will provide the basis for the Phase 2 solicitation, and will be used both by the Government for definition of the Phase 2 addendum and as a repository available for Phase 2 offerors' use. Therefore, the technical data rights proposed need to be consistent with these requirements.



Phase 2 - Additional information will be provided as a modification to this BAA to initiate Phase 2, which will consist of the Development and Operations phase. This modification is anticipated to be issued in January 2007.





II. AWARD INFORMATION:



Total funding for this BAA is approximately $18M. The anticipated funding to be obligated under this BAA is broken out by fiscal year as follows: FY 06 - $1.0M; FY 07 - $1.0M; FY 08 - $6.1M; FY 09 - $4.9M; and FY 10 - $5.0M. Individual awards will not normally exceed 6 months, with dollar amounts ranging from $300K to $400K per year for Phase I, and will not normally exceed 18 months, with dollar amounts ranging from $700K to $3.0M per year for Phase II. Awards of efforts as a result of this announcement will be in the form of contracts, grants, cooperative agreements, or other transactions depending upon the nature of the work proposed.



III. ELIGIBILITY INFORMATION:



1. ELIGIBLE APPLICANTS: All potential applicants are eligible. Foreign or foreign-owned offerors are advised that their participation is subject to foreign disclosure review procedures. Foreign or foreign-owned offerors should immediately contact the contracting office focal point, Lori L. Smith, Contracting Officer, telephone (315) 330-1955 or e-mail Lori.Smith@rl.af.mil for information if they contemplate responding. The e-mail must reference the title and BAA 06-13-IFKA.



2. COST SHARING OR MATCHING: Cost sharing is not a requirement.



IV. APPLICATION AND SUBMISSION INFORMATION:



1. APPLICATION PACKAGE: THIS ANNOUNCEMENT CONSTITUTES THE ONLY SOLICITATION. WE ARE SOLICITING WHITE PAPERS ONLY. DO NOT SUBMIT A FORMAL PROPOSAL AT THIS TIME. Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. See Section VI of this announcement for further details.



For additional information, a copy of the AFRL/Rome Research Sites "Broad Agency Announcement (BAA): A Guide for Industry," Aug 2005, may be accessed at: http://www.if.afrl.af.mil/div/IFK/bp-guide.doc.



2. CONTENT AND FORM OF SUBMISSION: Offerors are required to submit 4 copies of a 3 to 5 page white paper summarizing their proposed approach/solution. The purpose of the white paper is to preclude unwarranted effort on the part of an offeror whose proposed work is not of interest to the Government. The white paper will be formatted as follows: Section A: Title, Period of Performance, Estimated Cost of Task, Name/Address of Company, Technical and Contracting Points of Contact (phone, fax and email); Section B: Task Objective; and Section C: Technical Summary and Proposed Deliverables. Multiple white papers within the purview of this announcement may be submitted by each offeror. If the offeror wishes to restrict its white papers/proposals, they must be marked with the restrictive language stated in FAR 15.609(a) and (b). All white papers/proposals shall be double spaced with a font no smaller than 12 pitch. In addition, respondents are requested to provide their Commercial and Government Entity (CAGE) number, a fax number, and an e-mail address with their submission. All responses to this announcement must be addressed to the technical POC, as discussed in paragraph five of this section.



3. SUBMISSION DATES AND TIMES: It is recommended that white papers be received by the following dates to maximize the possibility of award: for FY 06 funding, by 30 March 2006; for FY 07, by 31 January 2007; for FY 08, by 31 October 2007; for FY 09, by 31 October 2008; and for FY 10, by 31 October 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. Submission of white papers will be regulated in accordance with FAR 15.208.



4. FUNDING RESTRICTIONS: The cost of preparing white papers/proposals in response to this announcement is not considered an allowable direct charge to any resulting contract or any other contract, but may be an allowable expense to the normal bid and proposal indirect cost specified in FAR 31.205-18. Incurring pre-award costs for ASSISTANCE INSTRUMENTS ONLY is regulated by the DoD Grant and Agreements Regulations (DODGARS).



5. OTHER SUBMISSION REQUIREMENTS: DO NOT send white papers to the Contracting Officer. All responses to this announcement must be addressed to:



Department of the Air Force, Air Force Materiel Command

AFRL/IFTC

525 Brooks Road, Rome, NY 13441-4505

Attn: Mr. Steven Drager





Respondents are required to provide their Dun & Bradstreet (D&B) Data Universal Numbering System (DUNS) number with their submittal and reference BAA 06-13-IFKA.



Electronic submission to Steven.Drager@rl.af.mil will also be accepted (if submitting electronically, you do not need to follow up by sending in 4 hard copies).



V. APPLICATION REVIEW INFORMATION:



1. CRITERIA: The following criteria, which are listed in descending order of importance, will be used to determine whether white papers and proposals submitted are consistent with the intent of this BAA and of interest to the Government: (1) Overall Scientific and Technical Merit -- including the approach for the development of the CONOPS and architecture, (2) Related Experience -- the extent to which the offeror demonstrates relevant technology and domain knowledge, (3) Maturity of Solution -- the extent to which existing capabilities and standards are leveraged and the relative maturity of the proposed technology in terms of reliability and robustness, and (4) Reasonableness and realism of proposed costs and fees (if any). Also, consideration will be given to past and present performance on recent Government contracts, and the capacity and capability to achieve the objectives of this BAA. No further evaluation criteria will be used in selecting white papers/proposals. Individual white papers/proposals will be evaluated against the evaluation criteria without regard to other white papers and proposals submitted under this BAA. White papers and proposals submitted will be evaluated as they are received.



2. REVIEW AND SELECTION PROCESS: Only Government employees will evaluate the white papers/proposals for selection. The Air Force Research Laboratory's Information Directorate has contracted for various business and staff support services, some of which require contractors to obtain administrative access to proprietary information submitted by other contractors. Administrative access is defined as "handling or having physical control over information for the sole purpose of accomplishing the administrative functions specified in the administrative support contract, which do not require the review, reading, or comprehension of the content of the information on the part of non-technical professionals assigned to accomplish the specified administrative tasks." These contractors have signed general non-disclosure agreements and organizational conflict of interest statements. The required administrative access will be granted to non-technical professionals. Examples of the administrative tasks performed include: a. Assembling and organizing information for R&D case files; b. Accessing library files for use by government personnel; and c. Handling and administration of proposals, contracts, contract funding and queries. Any objection to administrative access must be in writing to the Contracting Officer and shall include a detailed statement of the basis for the objection.



VI. AWARD ADMINISTRATION INFORMATION:



1. AWARD NOTICES: Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. Notification by email or letter will be sent by the technical POC. Such invitation does not assure that the submitting organization will be awarded a contract. Those white papers not selected to submit a proposal will be notified in the same manner. Prospective offerors are advised that only Contracting Officers are legally authorized to commit the Government.



All offerors submitting white papers will be contacted by the technical POC, referenced in Section VII of this announcement. Offerors can email the technical POC for status of their white paper/proposal no earlier than 45 days after proposal submission.



2. ADMINISTRATIVE AND NATIONAL POLICY REQUIREMENTS:



Depending on the work to be performed, the offeror may require a TOP SECRET facility clearance and safeguarding capability; therefore, personnel identified for assignment to a classified effort must be cleared for access to TOP SECRET information at the time of award. In addition, the offeror may be required to have, or have access to, a certified and Government-approved facility to support work under this BAA. Data subject to export control constraints may be involved and only firms holding certification under the US/Canada Joint Certification Program (JCP) (www.dlis.dla.mil/jcp) are allowed access to such data.



Phase 1 awardees will have to present the full extent of their work at the mid-term and final workshops, attendance at which will not be limited to only the Government and awardees. Also, the Phase 1 work will provide the basis for the Phase 2 solicitation, and will be used both by the Government for definition of the Phase 2 addendum and as a repository available for Phase 2 offerors' use. Therefore, the technical data rights proposed need to be consistent with these requirements.



3. REPORTING: Once a proposal has been selected for award, offerors will be required to submit their reporting requirements through one of our web-based reporting systems, known as JIFFY or TFIMS. Prior to award, the offeror will be notified which reporting system they are to use, and will be given complete instructions regarding its use.



VII. AGENCY CONTACTS:



Questions of a technical nature shall be directed to the cognizant technical point of contact, as specified below:



TPOC Name: Steven Drager

Telephone: (315) 330-2735

Email: Steven.Drager@rl.af.mil



Questions of a contractual/business nature shall be directed to the cognizant contracting officer, as specified below:



Lori Smith

Telephone (315) 330-1955

Email: Lori.Smith@rl.af.mil



The email must reference the solicitation (BAA) number and title of the acquisition.



In accordance with AFFARS 5315.90, an Ombudsman has been appointed to hear and facilitate the resolution of concerns from offerors, potential offerors, and others for this acquisition announcement. Before consulting with an ombudsman, interested parties must first address their concerns, issues, disagreements, and/or recommendations to the contracting officer for resolution. AFFARS Clause 5352.201-9101 Ombudsman (Aug 2005) will be incorporated into all contracts awarded under this BAA. The AFRL Ombudsman is as follows:



Jeffrey E. Schmidt

Colonel, USAF

Director of Contracting

(937) 255-0432 (voice)

(937) 255-5036 (fax)





All responsible organizations may submit a white paper, which shall be considered.

Added: Jun 10, 2008 2:03 pm
The purpose of this modification is to change the contractual POC listed in the announcement. The new Contracting POC is Lynn White. She can be reached at Lynn.White@rl.af.mil or 315-330-4996.



All questions prior to the submission of a technical and cost proposal, including inquiries regarding white papers should still be addressed to the technical POC listed in the announcement.



All other information remains the same.

Added: Jul 08, 2008 3:07 pm
The purpose of this modification is to republish the original announcement pursuant to FAR 35.016(c). This republishing also includes the following changes: (a) changes the evaluation criteria; (b) changes the NAICS Code and (c) allows foreign participation for certain countries. No other changes have been made.



NAICS CODE: 541712



FEDERAL AGENCY NAME: Department of the Air Force, Air Force Materiel Command, AFRL - Rome Research Site, AFRL/Information Directorate, 26 Electronic Parkway, Rome, NY, 13441-4514



TITLE: Software and Systems Test Track



ANNOUNCEMENT TYPE: Initial announcement



FUNDING OPPORTUNITY NUMBER: BAA #06-13-IFKA



CFDA Number: 12.800



DATES: It is recommended that white papers be received by the following dates to maximize the possibility of award: for FY 07 funding, by 31 May 2007; for FY 08, by 31 December 2007; for FY 09, by 31 December 2008; and for FY 10, by 31 December 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. FORMAL PROPOSALS ARE NOT BEING REQUESTED AT THIS TIME. See Section IV of this announcement for further details.



I. FUNDING OPPORTUNITY DESCRIPTION:



Background:

It is increasingly difficult to create software that deals successfully with increased device and system complexity. Moreover, networked distributed systems create vast increases in the scale and scope of information processing applications, exacerbating the challenges to system engineers' ability to specify, design, build, verify and test software. This situation is an emerging issue in information technology in general, but the requirements of military systems set them sharply apart from non-military applications in terms of reliability, robustness, security, interoperability and real-time operation.



Business, government, and technical endeavors ranging from financial transactions to space missions increasingly require complex software systems to function correctly. The complexity of the software arises from stringent requirements (e.g., for reliability in performance and integrity of the data used), the need to support a range of interactions with the environment in real time, and/or certain structural features. These attributes make software difficult to produce.



Major hardware-software failures in defense acquisition programs have occurred both because components that were expected to interoperate properly did not, and because tools did not work as advertised. Interoperability is required for network-centric environments, but experience has shown it very difficult to achieve. A scalable, flexible, realistic synthetic testing environment is required for stressing tools against key benchmarks, for assessment of tool utility by key program offices, and for use as a synthetic environment for testing tools against large systems (tens of millions of SLOC or larger) or systems-of-systems.



Software-intensive systems are critical to meeting the demands of current and future warfighting capabilities. The costs of developing these systems -- in terms of dollars, time and human resources -- are increasing rapidly.



• Approximately 40% of DoD's RDT&E budget is currently spent on software development, and this is expected to increase.



• “The (Army FCS) software task alone is five times larger than that required for Joint Strike Fighter and ten times larger than the F-22, which after two decades is finally meeting its software requirements.” -- Congressman Curt Weldon (April, 2004)



The complexity of current and emerging systems and systems of systems (SoS) requires breakthrough approaches in system development including new tools and methodologies spanning the entire system lifecycle from architecture design to system and SoS verification. One of the risk factors affecting the transitionability of such research is the difficulty of validating and verifying the functionality of research technologies against realistic problems.



The poor collaboration among people working across the technology maturity lifecycle has created a “valley of disappointment” where DoD programs fail to adopt advanced technologies, regardless of their inherent promise. A regime of ad hoc policies and procedures has arisen for transitioning software research into software practice in avionics and other domains.



The expectation is that promising work funded by one organization early in the lifecycle, e.g., a DARPA program focusing on Science and Technology, will be perpetuated by another organization in the following phase, e.g., an AFRL program on Research & Engineering. The DoD has over 100 separate organizations engaged in technology transfer. The current strategy and the number of organizations involved make it hard for knowledgeable people across the Technology Readiness Lifecycle - customers, researchers, engineers, and operators - to collaborate and plan for transition.



Program Engineers typically view new development tools and runtime platforms as risky due to:



• Insufficient evidence to prove their capabilities can pay off in production environments,



• Immature prototypes that lack stability, features, user support, training, and technology-to-tool chain integrations, and



• The absence of affordable and sustainable long-term commercial support plans. Software-Intensive Systems Producibility researchers typically assume others in government or the commercial marketplace will address these problems.



It is the intent of the Systems and Software Test Track to help address all of the above issues.



Objective:



The overall objective of this Systems and Software Test Track (SSTT) BAA program is to provide an open framework environment where an assortment of experimental tools and products may be deployed and allowed to interact in real-time, interactive, evolutionary and interdependent ways, thereby allowing rigorous testing of new technologies, methodologies and theories in support of the Software-Intensive Systems Producibility Initiative (SISPI). The Systems and Software Test Track will facilitate testing of Software-Intensive Systems Producibility research products and methods, provide an environment for research on DoD embedded systems and software problems, provide an ability for university and industry leverage of technology development, and establish a capability for successful technology transition and transfer.



The SSTT is intended to be an open collaborative research and development environment to demonstrate, evaluate, and document the ability of novel tools, methods, techniques, and technologies to yield affordable and more predictable production of software intensive systems. The SSTT system will bring together researchers, developers, and domain experts from different communities to de-fragment the knowledge necessary to achieve SISPI research, development and technology transition.



The SSTT will be a system where SISPI researchers can test their research against relevant challenge problems, and where independent analysis of SISPI research can be performed. This independent analysis would enable the SSTT to support acquisition program offices and analyze the utility of tools. Also, SSTT users can bring their unsolved problems to provide challenges that drive SISPI research where no such tools are available, or search for a solution by leveraging existing capabilities available in the SSTT.



The SSTT will have several logical roles of collaborating participants: Challenge Problem Provider (for providing challenge problems), Candidate Solution Provider (for providing candidate solutions to challenge problems), Experimenter (for running experiments and collecting experimental results), Collaborators (involved in collaborating on challenge problems, candidate solutions, and/or experiments), Solution Inquirer (actively searching for a needed capability) and SSTT Administrators (responsible for administering the SSTT infrastructure). Various community participants may play one or more of these logical roles. SSTT will enable Program Engineers from DoD Acquisition Programs to interact with SISPI Researchers from Industry Labs or Universities to define, discover and evaluate SISPI technology.



Phase 1 consisted of defining, developing, and documenting the Concept of Operations (CONOPS), a user-oriented document that describes system characteristics for a proposed system from the users' viewpoint; and defining, developing and documenting the architecture and the fundamental organization of the Systems and Software Test Track as embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution. Initial concepts along with the final CONOPS and architectures were presented to representatives from Government, Industry and Academia. Two awards were made and all documentation is available on the BCSW server (please contact steven.drager@rl.af.mil to request access). The CONOPS is posted on FedBizOpps along with this modification (Modification 1). Phase II is the environmental development and operations phase.



The environment is open and available for use by developers, as well as for independent analysis by the facility operators. This independent analysis allows the facility operators to support major defense acquisition program offices and to analyze the utility of tools. Program offices may bring their unsolved problems to the test track, either for help in solving them or to look for needed capability among the available tools. Lastly, this synthetic environment should provide a place where large codes can be tested, allowing requirements verification prior to production and deployment. This risk reduction affords the ability to verify and validate the functionality of today's complex software-intensive systems while providing researchers a realistic environment in which to verify their tools against realistic problems.



The Systems and Software Test Track provides a place (possibly virtual, not a single physical location) for experimental verification of Software-Intensive Systems Producibility technologies, which such verification demands given their novelty and the potential complexity of the underlying theories. The experimental platforms incorporate software technology to instrument, monitor, and test large-scale applications. Challenge problems for the open experimental platforms are accessible to all research teams. The experimental platform research includes subtasks to conduct large-scale coordination experiments and to develop methods and tools for evaluating the aggregate performance of applications. This environment provides a full range of collaborative technology challenges, run-time platforms and applications, experiments, evaluations, and demonstrations. A common infrastructure will enable control and data flow between application components in a distributed environment. The open experimentation environment will provide the fundamental reference architecture and underpinnings, helping researchers develop and test their designs and facilitating the transition of promising technologies into production use.



Research Concentration Areas:

The primary goal of Phase 2 is to research, design, and implement the Systems and Software Test Track (SSTT) from the Concept of Operations (CONOPS). The SSTT will facilitate testing of Software-Intensive Systems Producibility research products and methods, provide an environment for research on DoD embedded systems and software problems, enable universities and industry to leverage technology development, and establish a capability for successful technology transition and transfer. To ensure a broad range of collaborators, offerors need to provide details on how researchers will collaborate with the SSTT.



The Architecture provides the ability to understand and control those elements of system design that capture the system's utility, cost, and risk. These elements could be the physical components of the system and their relationships, the logical components, or the principles or patterns that create enduring structures. The Architecture provides a rigorous definition of what constitutes the fundamental organization of the Systems and Software Test Track, embodying all information regarding elemental relations, interactions, and interfaces.



Areas to consider when developing Architecture components include, but are not limited to:



• The environment to host, execute, and test technology, including both the hardware and the software run-time platforms



• Details on how researchers will collaborate with the SSTT



• Interfaces and collaborations



• Mechanisms to support research and analysis of DoD problems including, but not limited to: software artifacts, benchmarks, executables, source code, design documents, requirements documents, examples, models, fault data, lessons learned, software construction files and tools



• Data repositories for results, success stories, benchmarks, quantitative and qualitative results, software disaster studies defining problems and basic research areas, ability to upload/download artifacts



• Measurement Techniques and Software Forensics



• Metrics, including reduced development time; ease with which domain experts and software engineers can interact; ease with which different domain experts can specify and design code independently of one another; usability, i.e., the 'naturalness' of the modeling language from which code is generated; ability of test engineers to modify and tune code in the field; and accuracy of automated documentation in design



• Mechanisms for studying innovative systems and dynamic processes, not static snapshots



• Populating, demonstrating, promoting, and improving the SSTT. For the SSTT to be successful, it must be populated with challenge problems, software producibility tools, collaboration tools, and administrative tools.



References: Institute of Electrical and Electronics Engineers, IEEE Guide for Information Technology-System Definition-Concept of Operations (CONOPS) Document, IEEE Std 1362-1998, IEEE Computer Society Press, 1998.



Anticipated Schedule:



Phase 2, the Development and Operations phase, is anticipated to be a 4 year effort. Phase 3, the Research and Operations phase, is expected to begin upon the completion of Phase 2. Additional information will be provided as a modification to this BAA to initiate Phase 3. This modification is anticipated to be issued in September 2009.



II. AWARD INFORMATION:



Total funding for this BAA is approximately $18M. The anticipated funding to be obligated under this BAA is broken out by fiscal year as follows: FY 06 - $1.0M; FY 07 - $1.0M; FY 08 - $6.1M; FY 09 - $4.9M; and FY 10 - $5.0M. Individual awards will not normally exceed 6 months, with dollar amounts ranging from $300K to $400K per year, for Phase 1, and will not normally exceed 18 months, with dollar amounts ranging from $700K to $3.0M per year, for Phase 2. Awards of efforts as a result of this announcement will be in the form of contracts, grants, cooperative agreements, or other transactions, depending upon the nature of the work proposed.



III. ELIGIBILITY INFORMATION:



1. ELIGIBLE APPLICANTS: All potential applicants are eligible. Foreign allied participation is authorized at the prime contractor level. Foreign allied participation is allowed of the following countries: France, Germany, Greece, Israel, Italy, Luxembourg, Netherlands, Australia, Austria, Belgium, Canada, Denmark, Egypt, Finland, Norway, Portugal, Spain, Sweden, Switzerland, Turkey and United Kingdom.



2. COST SHARING OR MATCHING: Cost sharing is not a requirement.



IV. APPLICATION AND SUBMISSION INFORMATION:



1. APPLICATION PACKAGE: THIS ANNOUNCEMENT CONSTITUTES THE ONLY SOLICITATION. WE ARE SOLICITING WHITE PAPERS ONLY. DO NOT SUBMIT A FORMAL PROPOSAL AT THIS TIME. Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. See Section VI of this announcement for further details.



For additional information, a copy of the AFRL/Rome Research Sites "Broad Agency Announcement (BAA): A Guide for Industry," April 2007, may be accessed at: http://www.fbo.gov/spg/USAF/AFMC/AFRLRRS/Reference%2DNumber%2DBAAGUIDE/listing.html



2. CONTENT AND FORM OF SUBMISSION: Offerors are required to submit an electronic copy of a NOT to exceed 10 page white paper summarizing their proposed approach/solution. The purpose of the white paper is to preclude unwarranted effort on the part of an offeror whose proposed work is not of interest to the Government. The white paper will be formatted as follows: Page 1 - Section A: Title, Period of Performance, Estimated Cost of Task, Name/Address of Company, Technical and Contracting Points of Contact (phone, fax and email); Pages 2 to not more than 10 - Section B: Task Objective, and Section C: Technical Summary and Proposed Deliverables. Multiple white papers within the purview of this announcement may be submitted by each offeror. If the offeror wishes to restrict its white papers/proposals, they must be marked with the restrictive language stated in FAR 15.609(a) and (b). All white papers/proposals shall be double spaced with a font no smaller than 12 pitch. In addition, respondents are requested to provide their Commercial and Government Entity (CAGE) number, a fax number, and an e-mail address with their submission. All responses to this announcement must be addressed to the technical POC, as discussed in paragraph five of this section.



3. SUBMISSION DATES AND TIMES: It is recommended that white papers be received by the following dates to maximize the possibility of award: FY 07 by 31 May 2007; FY 08 by 31 December 2007; FY 09 by 31 December 2008 and, FY 10 by 31 December 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. Submission of white papers will be regulated in accordance with FAR 15.208.



4. FUNDING RESTRICTIONS: The cost of preparing white papers/proposals in response to this announcement is not considered an allowable direct charge to any resulting contract or any other contract, but may be an allowable expense under the normal bid and proposal indirect costs specified in FAR 31.205-18. Pre-award costs for ASSISTANCE INSTRUMENTS ONLY are regulated by the DoD Grant and Agreement Regulations (DoDGARs).



5. OTHER SUBMISSION REQUIREMENTS: DO NOT send white papers to the Contracting Officer. All responses to this announcement must be addressed to:



Steven.Drager@rl.af.mil

Attn: Mr. Steven Drager



Respondents are required to provide their Dun & Bradstreet (D&B) Data Universal Numbering System (DUNS) number with their submittal and reference BAA 06-13-IFKA.



V. APPLICATION REVIEW INFORMATION:



1. CRITERIA: The following criteria, which are listed in descending order of importance, will be used to determine whether white papers and proposals submitted are consistent with the intent of this BAA and of interest to the Government:



(1) Overall Scientific and Technical Merit -- Including the following:



a) Clear definition and development of the technical concepts

b) Technical excellence, creativity, and innovation of the development and experimentation approach for the SSTT architecture and interoperability, and the quality and quantity of proposed challenge problems.



c) Evidence of research acumen in integration and development of the SSTT and evidence of understanding the challenges that will be faced in the near, mid and far term.



(2) Related Experience - The extent to which the offeror demonstrates relevant technical knowledge and competency, the necessary software technologies domain knowledge, the diversity and quality of the team (balance of skill sets, excellence of academic-industry team members, and proven capability to conduct innovative practical research), and the strength of insight into program offices. Of particular importance is acumen in the unique challenges of integration and development confronting the SSTT in the near-to-far term;



(3) Maturity of Solution - The extent to which existing capabilities and standards are leveraged, the relative maturity of the proposed technology in terms of reliability and robustness, and previous demonstration of capability to accomplish the tasks.



(4) Collaboration - The extent to which the offeror demonstrates commitment and detailed plans for collaboration among all interested and relevant parties in the SSTT, including how researchers will collaborate within the SSTT, plans for sharing results from the SSTT with the broader community, and encouragement of participation by team and non-team members alike in evaluations and information dissemination.



(5) Reasonableness and realism of proposed costs and fees (if any).



No further evaluation criteria will be used in selecting white papers/proposals. Individual white papers/proposals will be evaluated against the evaluation criteria without regard to other white papers and proposals submitted under this BAA. White papers and proposals will be evaluated as they are received.



2. REVIEW AND SELECTION PROCESS: Only Government employees will evaluate the white papers/proposals for selection. The Air Force Research Laboratory's Information Directorate has contracted for various business and staff support services, some of which require contractors to obtain administrative access to proprietary information submitted by other contractors. Administrative access is defined as "handling or having physical control over information for the sole purpose of accomplishing the administrative functions specified in the administrative support contract, which do not require the review, reading, or comprehension of the content of the information on the part of non-technical professionals assigned to accomplish the specified administrative tasks." These contractors have signed general non-disclosure agreements and organizational conflict of interest statements. The required administrative access will be granted to non-technical professionals. Examples of the administrative tasks performed include: a. Assembling and organizing information for R&D case files; b. Accessing library files for use by government personnel; and c. Handling and administration of proposals, contracts, contract funding and queries. Any objection to administrative access must be in writing to the Contracting Officer and shall include a detailed statement of the basis for the objection.



VI. AWARD ADMINISTRATION INFORMATION:



1. AWARD NOTICES: Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. Notification by email will be sent by the technical POC. Such invitation does not assure that the submitting organization will be awarded a contract. Those white papers not selected to submit a proposal will be notified in the same manner. Prospective offerors are advised that only Contracting Officers are legally authorized to commit the Government.



All offerors submitting white papers will be contacted by the technical POC, referenced in Section VII of this announcement. Offerors can email the technical POC for status of their white paper/proposal no earlier than 45 days after proposal submission.



2. ADMINISTRATIVE AND NATIONAL POLICY REQUIREMENTS:



Depending on the work to be performed, the offeror may require a TOP SECRET facility clearance and safeguarding capability; therefore, personnel identified for assignment to a classified effort must be cleared for access to TOP SECRET information at the time of award. In addition, the offeror may be required to have, or have access to, a certified and Government-approved facility to support work under this BAA. Data subject to export control constraints may be involved and only firms holding certification under the US/Canada Joint Certification Program (JCP) (www.dlis.dla.mil/jcp) are allowed access to such data.



The technical data rights proposed must be consistent with the requirements that the Software and Systems Test Track is a resource available to the community at large. The interface specifications will be made available to all parties having their requests for access granted by the Government.



3. REPORTING: Once a proposal has been selected for award, offerors will be required to submit their reporting requirements through one of our web-based reporting systems, known as JIFFY or TFIMS. Prior to award, the offeror will be notified which reporting system to use and will be given complete instructions regarding its use.



VII. AGENCY CONTACTS:



Questions of a technical nature shall be directed to the cognizant technical point of contact, as specified below:



TPOC Name: Steven Drager

Telephone: (315) 330-2735

Email: Steven.Drager@rl.af.mil



Questions of a contractual/business nature shall be directed to the cognizant contracting officer, as specified below:



Lynn White

Telephone: (315) 330-4996

Email: Lynn.White@rl.af.mil



The email must reference the solicitation (BAA) number and title of the acquisition.



In accordance with AFFARS 5315.90, an Ombudsman has been appointed to hear and facilitate the resolution of concerns from offerors, potential offerors, and others for this acquisition announcement. Before consulting with an ombudsman, interested parties must first address their concerns, issues, disagreements, and/or recommendations to the contracting officer for resolution. AFFARS Clause 5352.201-9101 Ombudsman (Aug 2005) will be incorporated into all contracts awarded under this BAA. The AFRL Ombudsman is as follows:



Susan Hunter

Building 15, Room 225

1864 Fourth Street

Wright-Patterson AFB OH 45433-7130

FAX: (937) 225-5036; Comm: (937) 255-7754



All responsible organizations may submit a white paper which shall be considered.



Department of the Air Force, Air Force Materiel Command, AFRL - Rome Research Site, AFRL/Information Directorate 26 Electronic Parkway, Rome, NY, 13441-4514
Lynn G. White,
Contract Specialist
Phone: (315) 330-4996
Fax: (315) 330-8120