[Haskell-cafe] Can you do everything without shared-memory
concurrency?
Richard A. O'Keefe
ok at cs.otago.ac.nz
Mon Sep 8 22:21:10 EDT 2008
I think the demonstration is in Hoare's book on Communicating
Sequential Processes; in any case, if you have pure processes and
message passing, you can simulate conventional variables.
Here's an Erlang version:
%% A "variable" is a process that holds the current state and answers
%% {ask,Sender} and {tell,New_State} messages.
variable_loop(State) ->
    receive
        {ask, Sender} ->
            Sender ! {self(), State},
            variable_loop(State);
        {tell, New_State} ->
            variable_loop(New_State)
    end.

new_variable(Initial_State) ->
    spawn(fun () -> variable_loop(Initial_State) end).

fetch(Variable) ->
    Variable ! {ask, self()},
    receive {Variable, State} -> State end.

store(Variable, New_State) ->
    Variable ! {tell, New_State}.

%% An "array" is just a tuple of such variable processes.
new_array(Size, Initial_State) ->
    list_to_tuple([new_variable(Initial_State)
                   || _Dummy <- lists:seq(1, Size)]).

fetch(Array, Index) ->
    fetch(element(Index, Array)).

store(Array, Index, New_State) ->
    store(element(Index, Array), New_State).
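
Since this is the Haskell list, here is a rough Haskell rendering of the
same server-process idea. It is only a sketch: it assumes
Control.Concurrent.Chan as the message-passing primitive, and the names
Variable, newVariable, fetch and store are mine, chosen to mirror the
Erlang above.

import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)

-- A request is either "ask" (carrying a reply channel) or "tell" (a new state).
data Request a = Ask (Chan a) | Tell a

-- A "variable" is just the request channel of its server process.
newtype Variable a = Variable (Chan (Request a))

newVariable :: a -> IO (Variable a)
newVariable initial = do
    requests <- newChan
    _ <- forkIO (loop requests initial)
    return (Variable requests)
  where
    loop requests state = do
        request <- readChan requests
        case request of
            Ask reply -> writeChan reply state >> loop requests state
            Tell new  -> loop requests new

fetch :: Variable a -> IO a
fetch (Variable requests) = do
    reply <- newChan
    writeChan requests (Ask reply)
    readChan reply

store :: Variable a -> a -> IO ()
store (Variable requests) new = writeChan requests (Tell new)

fetch blocks on its private reply channel, just as the Erlang fetch
blocks in its receive on {Variable,State}.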
The simulation of (shared) mutable variables and arrays in pure
CSP is very similar. This pretty much _has_ to be possible in
any language that has concurrent processes that can communicate
in some fashion. There are also quite elementary ways to
simulate locking (one sketch follows below). So we can solve any
concurrent problem that, say, C++ with Intel's TBB library, or Java
with java.util.concurrent, or any other such language can solve.
We can even get ourselves just as far into extremely serious trouble
as we can using those languages; it's just that we don't _have_ to.
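
To make the locking remark concrete, here is one sketch, under the same
assumptions and with my own names as before: a lock is a server process
that grants the lock to one requester at a time and queues the rest.

import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)

data LockRequest = Acquire (Chan ())   -- reply channel: "you now hold it"
                 | Release

newtype Lock = Lock (Chan LockRequest)

newLock :: IO Lock
newLock = do
    requests <- newChan
    _ <- forkIO (unlocked requests)
    return (Lock requests)
  where
    -- Unlocked: grant the next Acquire; ignore a stray Release.
    unlocked requests = do
        request <- readChan requests
        case request of
            Acquire reply -> writeChan reply () >> locked requests []
            Release       -> unlocked requests
    -- Locked: remember further requesters; on Release, grant the next one.
    locked requests waiting = do
        request <- readChan requests
        case request of
            Acquire reply -> locked requests (waiting ++ [reply])
            Release       -> case waiting of
                []          -> unlocked requests
                next : rest -> writeChan next () >> locked requests rest

acquire :: Lock -> IO ()
acquire (Lock requests) = do
    reply <- newChan
    writeChan requests (Acquire reply)
    readChan reply      -- blocks until the server grants the lock

release :: Lock -> IO ()
release (Lock requests) = writeChan requests Release

The waiting list simply hands out the lock in FIFO order.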
--
If stupidity were a crime, who'd 'scape hanging?