[Spark2014-discuss] [M503-001] SPARK 2014: proof rules for type invariants

Steve Baird baird at adacore.com
Fri May 3 22:46:15 CEST 2013


On 05/03/2013 12:51 AM, Yannick Moy wrote:
>> I think we could get into some complicated messes with
>> the LSP requirements we impose for overriding subprograms
>> if we have different implicit contracts for a subprogram
>> depending on where it is declared, but I doubt this nasty
>> interaction would come up in practice.
>
> I'll have to see an example.
>

We have a tagged type T1 which is not private in any way;
the declaration of said T1 (of course) makes no mention of
any type_invariant.

We also have a primitive procedure Foo for this type which takes
an in out parameter of the tagged type. It has no explicit
precondition, so it gets an implicit precondition of True.
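
In code, the first half might look like this (a minimal spec-only
sketch; Pkg1 and the component A are invented names, body elided):

package Pkg1 is

   --  Plain tagged type, fully visible, with no Type_Invariant anywhere.
   type T1 is tagged record
      A : Natural := 0;
   end record;

   --  Primitive operation taking the type as an in out parameter;
   --  no explicit precondition, so implicitly just True.
   procedure Foo (X : in out T1);

end Pkg1;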

Next, we have a private extension T2 of T1 (I think it could be
a private type, but this way is simpler) with a type_invariant;
its full view is an extension of T1.

T2 has its own overriding Foo procedure explicitly declared
(and visible).

This overriding Foo is a "boundary" subprogram, so it gets the
"implicit" precondition described explicitly in the SPARK RM.

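Continuing the sketch (again spec-only and with invented names;
Is_Consistent just stands in for whatever the invariant expression
actually is):

with Pkg1;

package Pkg2 is

   --  Private extension carrying a type_invariant; the full view
   --  below is an extension of T1.
   type T2 is new Pkg1.T1 with private
     with Type_Invariant => Is_Consistent (T2);

   function Is_Consistent (X : T2) return Boolean;

   --  Boundary subprogram: per the SPARK RM it gets an implicit
   --  precondition of Is_Consistent (X), whereas the overridden
   --  Pkg1.Foo has only the implicit precondition True.
   overriding procedure Foo (X : in out T2);

private

   type T2 is new Pkg1.T1 with record
      B : Natural := 0;
   end record;

   function Is_Consistent (X : T2) return Boolean is (X.B <= X.A);

end Pkg2;
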
Unfortunately, the original (overridden) Foo had no such
precondition, so we get an LSP violation: the precondition of
the overridden subprogram (i.e., True) does not imply the
precondition of the overrider (i.e., the type invariant).

This is not a problem for subtype validity. You can think
of the "all parts satisfy their subtype" precondition
for inputs as being like a dispatching predicate in the
case of a tagged type, so that an overrider has (textually)
the same precondition as the overridden subprogram, even
though the "all parts" check covers more parts in the case
of an extension. Thus, no LSP problems.

So if we could get type invariants to work similarly
to subtypes in this respect, I think this problem
might go away.
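
To make the analogy concrete, here is one way to picture it; this
is a hypothetical encoding of "all parts satisfy their subtype" as
an inherited class-wide condition, not something the tools
literally generate. Because the condition dispatches with the
object, overridden and overrider share the same text even though
the T2 version covers more parts:

package Pkg1 is

   type T1 is tagged record
      A : Natural := 0;
   end record;

   --  Stands in for "all parts of X satisfy their subtypes".
   function Parts_OK (X : T1) return Boolean is (X.A'Valid);

   --  The condition is stated once, class-wide.
   procedure Foo (X : in out T1)
     with Pre'Class => Parts_OK (X);

end Pkg1;

with Pkg1;

package Pkg2 is

   type T2 is new Pkg1.T1 with private;

   --  For the extension, Parts_OK covers the new parts as well ...
   overriding function Parts_OK (X : T2) return Boolean;

   --  ... but the inherited Pre'Class is still just Parts_OK (X):
   --  textually the same condition, reinterpreted for T2, so there
   --  is no precondition to weaken and no LSP problem.
   overriding procedure Foo (X : in out T2);

private

   type T2 is new Pkg1.T1 with record
      B : Natural := 0;
   end record;

   overriding function Parts_OK (X : T2) return Boolean is
     (Pkg1.Parts_OK (Pkg1.T1 (X)) and then X.B'Valid);

end Pkg2;

If type invariants were folded into such a class-wide, per-object
condition rather than into a per-declaration implicit precondition,
the overridden and overriding Foo would again carry the same
contract text.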

Does that make things clearer?

   -- Steve
