[Spark2014-discuss] [M122-037] specifying SPARK sources for analysis
moy at adacore.com
Thu Mar 14 15:24:06 CET 2013
On 03/14/2013 03:17 PM, Cyrille Comar wrote:
>> To me, two solutions make sense:
>> - an aspect on declarations
>> - a restricted on/off mechanism (for example, a specific pragma
>> SPARK_Policy) which appears either as configuration pragma, or at
>> library level in a package, with the intent that it applies to the
>> declarations that follow.
> but the latter is not an on/off mechanism. A config pragma applying to
> a library-level unit is pretty much equivalent to an aspect on the
> corresponding package entity.
In some cases, the second solution could be more convenient, for
example if you want to state that all declarations in the public part
of a package are in SPARK, but not all the declarations in the private
part. With the aspect, you would have to write "SPARK => False" on each
declaration of the private part. With the pragma, you just need a
single pragma SPARK (Off) at the start of the private part.
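To make the comparison concrete, here is a hedged sketch of what that
could look like. Note that "pragma SPARK" and the "SPARK" aspect are
proposals under discussion in this thread, not a settled language
feature, and the package contents below are invented for illustration:

```ada
--  Hypothetical sketch: public part in SPARK, private part excluded
--  with a single pragma instead of "SPARK => False" on each
--  private declaration.

package Stacks is
   pragma SPARK (On);  --  applies to the declarations that follow

   type Stack is private;

   procedure Push (S : in out Stack; X : Integer);

private
   pragma SPARK (Off);  --  everything below is outside the subset

   type Int_Array is array (Positive range 1 .. 100) of Integer;

   type Stack is record
      Top  : Natural := 0;
      Data : Int_Array;
   end record;
end Stacks;
```

With the aspect-based solution, the same effect would instead require
"with SPARK => False" on each of the three private declarations.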
>> I'm not happy currently with the "detect" mode in which GNATprove works.
>> I think users need a way to state exactly which parts of a unit should
>> be in SPARK (and should be analyzed by GNATprove), and which are not.
>> And I'd rather fix this in the coming weeks than at the end of the year.
> why aren't you happy? That will help us make progress...
As a user of formal verification tools, I'd like to be able to state
which parts of a unit should be in SPARK and subject to formal analysis.
Being able to inspect a posteriori what the tool could handle or not is
not enough.
> but we should not confuse
> 1. specifying that a given (subprogram) entity is intended to be in SPARK
> 2. specifying the scope of a given analysis.
> Sure we need a subprogram body to be in SPARK to be able to analyse it
> for proof but we don't necessarily wish to analyze all the time all
> entities that are in the SPARK subset.
> Are we talking of 1. or 2. in this TN? the subject implies 2. but
> everything that has been said makes me think that we mostly discuss 1.
We're talking about 1. The widest scope for 2 is precisely the scope
defined by 1. You're right that we may want to restrict this scope
further, but that we already provide at the tool level (the "Prove
File" and "Prove ..." commands).
Yannick Moy, Senior Software Engineer, AdaCore