To me, causality is less important than simple prediction. Determining cause is a philosophical question, somewhat like asking whether machines can truly think. If a system behaves as if it contained a causal process (i.e., it is predictable), then whether one event truly causes another is irrelevant.