FFI RC1 - What are you doing to Haskell?
Manuel M. T. Chakravarty
chak at cse.unsw.edu.au
Mon Feb 4 23:57:52 EST 2002
"Anthony Travers" <amt-public at dodo.com.au> wrote,
> I'm not sure if ffi at haskell.org is the appropriate place for an
> observation/opinion piece - it was the only mailing list mentioned
> on the FFI specs release page and in the specs themselves.
> If there's a more appropriate mailing list for this commentary,
> let me know.
This list is fine. haskell at haskell.org would also have been
ok.
> This is most clearly demonstrated by those laughable
> special identifiers (page 3) denoting specific call conventions
> which future Haskell implementations would have to be contorted
> into supporting. (what will be next? "chash","asm","intercal",
> "vb"..?)
I would say "asm" and "stdcall" are about the same ;-)
I don't think I really understand what you consider the
problem with the specification of calling conventions to be.
The only calling convention that a system must support is
"ccall". Only *if* a system supports another calling
convention for which a standard identifier exists does it
have to use that identifier. If no calling convention other
than ccall is supported, that's perfectly fine.
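To make this concrete, here is a minimal sketch of the
declarations in question (the stdcall import is purely
illustrative; "some_win32_fun" is a made-up name):

  import Foreign.C.Types (CDouble, CInt)

  -- "ccall" must be accepted by every implementation:
  foreign import ccall "math.h sin"
    c_sin :: CDouble -> CDouble

  -- "stdcall" only needs to be accepted by implementations
  -- that support that calling convention at all:
  foreign import stdcall "some_win32_fun"
    someWin32Fun :: CInt -> IO CInt

An implementation that knows nothing about stdcall simply
rejects the second declaration; it is not required to
contort itself into supporting the convention.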
> From http://www.haskell.org/ghc/overview.html:
>
> "We think that Haskell is a great language
> to write applications in, and are dead keen for
> GHC to be used for this purpose."
>
> If we all accept this statement for all mature Haskell compilers
> then we must hold our ground - rather than dropping down to the
> standards of other languages, those languages should come up to the
> standard of Haskell.
Applications need libraries. A (relatively) new language
generally has fewer libraries than more established
languages. So, the easiest way to get more libraries is to
be able to directly use libraries written in other
languages.
> The mechanism to do this already exists in Haskell, but its use
> has hitherto been confined to Haskell - module imports.
>
> The interoperability challenge then becomes: how do I bundle up
> that C/C++/.NET/Java/etc library into a Haskell module?
>
> Encapsulated as Haskell modules, old libraries can then
> access and be accessed by code from other Haskell modules
> using the standard import/export mechanism.
>
> So instead of having Haskell implementations handle the calling
> conventions of C, C++, Java, .NET, etc, the Haskell calling
> convention is handled by the Haskell module encapsulators for
> C, C++, Java, .NET, etc, hence moving the complexity out
> of already-complicated Haskell implementations into separate
> libraries or modules.
I don't really see how this changes the complexity in any
significant way. There is not a single Haskell calling
convention. Every system has its own and in fact an
optimising compiler may use different conventions for
different functions in a single program. So, each of
these encapsulators would be system-specific and would, in
the end, be implemented by the same people who implemented
the system it matches.
Moreover, retrofitting a Haskell interface on, say, a C
library is a non-trivial task, which depends on the library
that's being dressed up. In what language do you express
this? In C? In a new language? We think we'd rather do
it in Haskell itself. But to do so, we must be able to
denote a call to a C function in Haskell, which immediately
brings us to foreign import declarations.
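For instance, dressing up a plain C function in a more
Haskell-ish interface might look like the following (getenv
is just a well-known C function chosen for illustration, and
the wrapper name is made up):

  import Foreign.C.String (CString, peekCString, withCString)
  import Foreign.Ptr      (nullPtr)

  -- The raw C entry point, denoted in Haskell by a foreign
  -- import declaration:
  foreign import ccall "stdlib.h getenv"
    c_getenv :: CString -> IO CString

  -- The Haskell-side dressing, written in Haskell itself,
  -- hiding the marshalling of C strings:
  getEnvMaybe :: String -> IO (Maybe String)
  getEnvMaybe name =
    withCString name $ \cname -> do
      cval <- c_getenv cname
      if cval == nullPtr
        then return Nothing
        else fmap Just (peekCString cval)

Both the raw import and the nicer interface live in the same
Haskell module; no separate encapsulator language is needed.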
> In this case the problem becomes: how to enable foreign code
> access between Haskell and other languages without rewriting
> the Haskell-side access mechanisms for each language.
>
> Type classes were the adopted solution then - perhaps they are
> the solution now:
>
> data C = ....
> data CPlusPlus = ....
> data Java = ....
> data DotNET = ....
> :
> :
>
> class Foreign l where ....
>
> instance Foreign C where ....
> instance Foreign CPlusPlus where ....
> instance Foreign Java where ....
> instance Foreign DotNET where ....
> :
> :
And what would the member functions in these classes be?
Cheers,
Manuel