AdaCore Blog

SPARK 2014 Rationale: Mixing SPARK and Ada Code

by Yannick Moy

The first step before any formal verification work with SPARK is to delimit the part of the code that will be subject to formal verification (the code in SPARK) within the overall Ada application (which could also contain parts coded in C, in Java, in assembly, etc.).

The possibility of easily linking Ada code with code in other programming languages (C in particular) has been one of the landmark features of Ada since the start, with an Annex of the Ada Reference Manual dedicated to such interfacing. As in many programming languages for embedded applications, Ada also offers the ability to call assembly instructions directly within the program.

None of these models is suitable for interfacing Ada code with SPARK code:

  • SPARK being a subset of Ada, it would be overly restrictive to limit the interface to link-time combination of separate units written fully in Ada or fully in SPARK;
  • SPARK being used for formal verification, it would be overly permissive to allow free mixing of Ada and SPARK code, without clear boundaries.

The solution we've come up with for SPARK 2014 is to let the user define those parts of the code that are in SPARK, using a special aspect or pragma SPARK_Mode. The rest of the code is allowed to use Ada features that are not in SPARK. For example, assume I have a unit with a core service in SPARK, called Compute, and logging and display services in Ada. I can describe this as follows:

  package Services is
     procedure Compute with SPARK_Mode;
     procedure Log;
     procedure Display;
  end Services;

I can still call the SPARK and Ada procedures freely from each other, for example:

  package body Services is
     procedure Compute with SPARK_Mode is
     begin
        null;  --  do something
     end Compute;

     procedure Log is
     begin
        null;  --  log values
     end Log;

     procedure Display is
     begin
        null;  --  display values
     end Display;
  end Services;

Because procedures in SPARK and in Ada are clearly identified, formal verification can be applied to the former and the usual verification based on testing to the latter. Combining these results is possible by using subprogram contracts.

What matters for formally analyzing Compute above is that procedure Log has a signature compatible with SPARK restrictions, and that it declares in a subprogram contract any constraint on calling it (the precondition) and any effect it has on its environment (the global annotation). Note that the GNATprove tool automatically generates a safe approximation of the global annotation if the user does not provide one.
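As a sketch of what such a contract could look like, here is a version of the spec where Log declares its effect on a logging variable; the name Log_Data and the particular precondition are illustrative, not taken from the original example:

  package Services is
     Log_Data : Integer := 0;  --  hypothetical logging state

     procedure Compute with SPARK_Mode;

     --  Log's contract makes its calling constraint and its effect
     --  explicit, so that SPARK callers such as Compute can be analyzed
     procedure Log with
       Global => (In_Out => Log_Data),
       Pre    => Log_Data < Integer'Last;

     procedure Display;
  end Services;

With this contract, GNATprove can check at each call to Log from SPARK code that the precondition holds, and account for the update of Log_Data, without needing to analyze Log's body.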

If a unit is mostly in SPARK, it can be marked itself in SPARK, and individual subprograms in the unit can opt out of SPARK, for example:

  package Services with SPARK_Mode is
     procedure Compute;
     procedure Log with SPARK_Mode => Off;
     procedure Display with SPARK_Mode => Off;
  end Services;

A spec (subprogram or package) can be in SPARK while its body is not, which is typical of features that will be called from SPARK code, but which are not themselves implemented in SPARK. Likewise, a package public part can be in SPARK, but not its private part, which is expressed as follows:

  package Services with SPARK_Mode is
     --  SPARK interface
  private
     pragma SPARK_Mode (Off);
     --  implementation in Ada
  end Services;
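The first case, a SPARK spec with an Ada body, can be sketched like this, using a hypothetical Logging unit:

  --  the spec is in SPARK, so SPARK code can call Log
  package Logging with SPARK_Mode is
     procedure Log (Msg : String);
  end Logging;

  --  the body opts out of SPARK and is not analyzed by GNATprove
  package body Logging with SPARK_Mode => Off is
     procedure Log (Msg : String) is
     begin
        null;  --  full Ada implementation goes here
     end Log;
  end Logging;

GNATprove then relies only on the spec (and its contract, if any) at call sites, while the body is verified by other means, such as testing.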

Finally, entities that are neither marked in SPARK or not in SPARK may be used in SPARK code, as far as their declaration does not violate SPARK rules. This greatly facilitates using other units from SPARK code, as the code used needs not be marked with SPARK_Mode aspect or pragma. For example, this is the case for the Ada standard library: many subprograms of the standard library have a declaration compatible with SPARK, but they are not currently marked in SPARK; they can nonetheless be called from SPARK code (for example, to do I/O).
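For instance, a minimal SPARK procedure can call Ada.Text_IO for output, even though the standard library itself carries no SPARK_Mode marking:

  with Ada.Text_IO;

  --  a SPARK procedure calling an unmarked standard-library subprogram
  procedure Say_Hello with SPARK_Mode is
  begin
     Ada.Text_IO.Put_Line ("Hello from SPARK");
  end Say_Hello;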

Posted in #Language    #Formal Verification    #SPARK   

About Yannick Moy

Yannick Moy is a Senior Software Engineer at AdaCore and co-director of the ProofInUse joint laboratory. At AdaCore, he works on the software source code analyzers CodePeer and SPARK, aiming either at detecting bugs or at verifying safety/security properties. Yannick leads the development of SPARK, a product he presents in articles, conferences, classes and blogs (this one in particular). Yannick previously worked on source code analyzers for PolySpace (now The MathWorks) and at Université Paris-Sud.