The Saudi government is having to use the rhetoric of "fighting terrorism" to allow women to own their own businesses. They're hoping this will help improve the economy and thus address their massive unemployment (whence all those terrorists are spawned).
The move is, of course, opposed by conservative religious groups (much like every other social advancement).
I wonder if there are any large religions that are women-positive. The only one I can think of is Wicca; beyond that, any religiously approved positive treatment of women has come from women demanding better treatment.