FFI RC1 - What are you doing to Haskell?

Anthony Travers amt-public at dodo.com.au
Sun Feb 3 00:53:44 EST 2002


	I'm not sure if ffi at haskell.org is the appropriate place for an
	observation/opinion piece - it was the only mailing list mentioned
	on the FFI specs release page and in the specs themselves.
	If there's a more appropriate mailing list for this commentary,
	let me know.

	First some background info...

	Haskell has been my first PL of choice since April 2000, but I have
	been swatting up on the language and related topics since April
	1999, so I have a general overview of most of the concepts behind
	the language and FP in general. "Amateur software research and
	development" is currently my main recreation (for better or
	worse!). My primary Haskell system is Hugs, with occasional
	but slowly increasing usage of HBI & HBC.

	Enough background...

	From what I've read, the FFI's approach to achieving
	interoperability looks like The Death Of One Thousand Cuts for
	Haskell. This is most clearly demonstrated by those laughable
	special identifiers (page 3) denoting specific calling conventions
	which future Haskell implementations would have to be contorted
	into supporting. (What will be next? "csharp", "asm", "intercal",
	"vb"..?)

	From http://www.haskell.org/ghc/overview.html:

		"We think that Haskell is a great language
		 to write applications in, and are dead keen for
		 GHC to be used for this purpose."

	If we all accept this statement for all mature Haskell compilers,
	then we must hold our ground - rather than Haskell dropping down
	to the standards of other languages, those languages should come
	up to the standard of Haskell.

	The mechanism to do this already exists in Haskell, but its use
	has hitherto been confined to Haskell - module imports.

	The interoperability challenge then becomes: how do I bundle up
	that C/C++/.NET/Java/etc library into a Haskell module?

	Encapsulated as Haskell modules, old libraries can then
	access and be accessed by code from other Haskell modules
	using the standard import/export mechanism.
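
	(As a rough illustration of what such an encapsulation might look
	like from the caller's side - the module name and function are
	invented for this sketch, and the body is only a placeholder for
	whatever glue actually binds the underlying C library:)

		module Compression (compress) where

		import Data.Word (Word8)

		-- Only ordinary Haskell types appear in the interface; the
		-- mechanics of reaching the underlying C library stay hidden
		-- inside this module.
		compress :: [Word8] -> [Word8]
		compress = error "binding to the C library goes here"

	A client then just writes "import Compression" and never sees the
	foreign machinery at all.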

	So instead of having Haskell implementations handle the calling
	conventions of C, C++, Java, .NET, etc., the Haskell calling
	convention would be handled by the Haskell module encapsulators
	for C, C++, Java, .NET, etc. This moves the complexity out of
	already-complicated Haskell implementations and into separate
	libraries or modules.



	The problem of Haskell interoperability is reminiscent of the
	problem regarding numeric types during the original definition
	of Haskell: how to have overloaded arithmetic operators over
	separate numeric types (unlike Miranda(TM)) which could be used
	to define new overloaded functions (unlike Standard ML).
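
	(The numeric case, for comparison: because (+) and (*) are
	overloaded through the Num class, a function written in terms of
	them is automatically overloaded over every numeric type as well -
	something like:)

		-- One definition, usable at Int, Integer, Double, and any
		-- user-defined Num instance.
		square :: Num a => a -> a
		square x = x * x

	With that one definition, square (3 :: Int) and
	square (1.5 :: Double) both work.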

	In this case the problem becomes: how to enable foreign code
	access between Haskell and other languages without rewriting
	the Haskell-side access mechanisms for each language.

	Type classes were the adopted solution then - perhaps they are
	the solution now:

		data C		=  ....
		data CPlusPlus	=  ....
		data Java	=  ....
		data DotNET	=  ....
			:
			:

		class Foreign l where ....

		instance Foreign C where ....
		instance Foreign CPlusPlus where ....
		instance Foreign Java where ....
		instance Foreign DotNET where ....
			:
			:
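
	(To make the idea a little more concrete - this is only one
	possible way of filling in the dots, with every method and type
	name invented purely for illustration:)

		-- Each foreign language is represented by a type ...
		data C    = C
		data Java = Java

		-- ... and a placeholder universal type stands in for
		-- marshalled argument and result values.
		data ForeignValue = FVInt Int | FVString String

		-- The class collects whatever operations every language
		-- binding must support; here, just "call the named entry
		-- point of a library written in language l".
		class Foreign l where
		   callIn :: l -> String -> [ForeignValue] -> IO ForeignValue

		instance Foreign C where
		   callIn _ name _ = error ("call into C function " ++ name)

		instance Foreign Java where
		   callIn _ name _ = error ("call into Java method " ++ name)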


	Anthony Travers (amt DASH public AT dodo DOT com DOT au)


