[Spark2014-discuss] [M122-037] specifying SPARK sources for analysis

Yannick Moy moy at adacore.com
Thu Mar 14 16:22:49 CET 2013

On 03/14/2013 04:08 PM, Tucker Taft wrote:

> But I think trying to make these one mechanism is what is creating the
> problem.  I think they may be fundamentally different issues, and
> saying something is either "in SPARK" or "not in SPARK" is apparently
> too "binary" for some people.  They would rather indicate what sort
> of a set of restrictions they want to impose (currently dubbed a "code
> profile"), which might take them below the "SPARK 2014" line.

There are two issues above:
- the boundary between SPARK and full Ada code. Even though being in SPARK 
does not guarantee that the code is provable, it does make the code formally 
analyzable by the SPARK toolset.
- the coding standard you apply. I think the long review that Arno 
posted on M226-014 shows this can be handled separately.

I'm mostly interested in the first one above.

> One of the problems is that exactly what is the SPARK 2014
> line may vary from release to release of the tool.
> It will not be good if the meaning of these pragmas changes
> over time.

It won't. What is supported may change, but not what is in SPARK 2014.

> I think we want pragmas connected to specific Ada features.

I think the coding standard is a separate issue. Arno has suggested ways 
for supporting it that do not require new mechanisms.

> At any given time, the tool may have trouble proving code that uses
> certain features, but later it may be able to handle it.  You don't
> want to have to re-annotate your code just because the tool has
> gotten smarter.

I fully agree. I have no problem with GNATprove initially bailing out 
with a "not supported" warning on SPARK code that it does not support.

>> We can always add more control on the tool side, if needed, possibly
>> with patterns as you
>> suggest. These may not be in the code, and in any case they are not
>> part of SPARK 2014.
> I don't think we can delay this, at least from a design point of
> view, because the control of the tool and the control of which
> Ada features are permitted in a given piece of source text, are
> not really the same thing, and trying to merge them into one
> specification seems to be creating a problem.

I have a third control in mind, which is simpler and more essential than 
either tool control or feature control: the SPARK/not-SPARK boundary. Tool 
control can be delayed if we agree that the tool initially deals with SPARK 
code only (possibly notifying the user when it cannot yet handle some 
features).
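As a concrete illustration of that third control, here is a minimal sketch of how a per-unit boundary marker could look, using an aspect in the style of what SPARK 2014 eventually adopted as SPARK_Mode (the Counters package and its contents are hypothetical, purely for illustration):

```ada
--  Specification: inside the SPARK boundary, so the toolset analyzes it.
package Counters
  with SPARK_Mode => On
is
   procedure Increment;
end Counters;

--  Body: drops to full Ada; marked Off so the tool skips it
--  instead of rejecting the whole unit.
package body Counters
  with SPARK_Mode => Off
is
   Count : Integer := 0;

   procedure Increment is
   begin
      Count := Count + 1;
   end Increment;
end Counters;
```

The point of a binary On/Off marker like this is exactly the separation argued for above: it says only which side of the SPARK boundary a unit is on, independently of any coding-standard profile and of what the tool happens to support in a given release.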

Yannick Moy, Senior Software Engineer, AdaCore
