To be honest, yes it should swerve. Shame about the cows, but unless they were Friesians they should have been on a pedestrian crossing. (Friesians are black and white, so they should never be anywhere near a pedestrian crossing.)
Actually, the bias against hitting cows should be fairly strong. Don't get me wrong: I'm a bovivore, so I have no problem with killing cows, but animals of that size are themselves a danger to the occupants. Sure, if the choice is plowing into a brick wall or a herd of cattle at 60 mph, you go for the cows, but if it's a choice between a herd of cows and rear-ending an F-250 or a Hummer at 60, the occupants are probably safer if you hit the vehicle. The impact with the vehicle will be harder as such things tend to weigh more than cows, but the vehicle is less likely to come crashing down on your windshield.
But as for the broader moral problem of hitting pedestrians vs. causing the death of the occupants of the vehicle, I doubt there are many accidents where, given a human driver with decades of experience with human culture and human moral values, all three of the following hold:
1) There is little enough time to react that there is no third option that kills nobody.
2) There is enough time to react that either choice can be executed without losing control of the vehicle, at which point subsequent events become unpredictable.
and
3) There is enough time for a human to make a reasoned value judgement on moral principles, rather than a spur-of-the-moment reflex reaction, without using up so much time that 2) is no longer true.
So in most cases with a *human driver*, there is no time for a moral dilemma, and most of the moral choices that contribute to a given accident will have occurred long before the event itself (e.g., "Am I going to make a habit of driving like a maniac?", "Am I going to get drunk tonight and then drive myself home?", etc.).
For a machine, which will not have the cultural resources and biological instincts that humans can draw on to make a moral decision, the proportion of cases in which there is time for a moral dilemma at all will be even smaller.
So I frankly think that, to the degree that we can provide them with the instrumentation to determine such things, self-driving vehicles should decide what to hit based on the physical characteristics of the collision: the mass of the object to be hit; the height of its center of gravity (if it's higher than the hood, it's more likely to come crashing down on the windshield and kill the front-seat occupants); and how much the maneuvering needed to avoid it would test the handling limits of the vehicle (if we lose control, depart the road sideways, and roll over, there's considerable danger to the occupants and to anybody by the side of the road).
"Minimize the amount of mass you make contact with" is, I think, a fairly good rule to start with. Mass is a good proxy for the severity of the accident for the occupants, the property value of any inanimate objects or animals you might strike, the size of any crowd of people, and the likelyhood of losing control and endangering the occupants and who knows who else after the impact. This won't produce a morally perfect choice every time, but I think it's a good heuristic. Now, it may not be realistically possible even to evaluate mass well enough to actually use this heuristic, but it's a whole lot more realistic than having your car evaluate human moral considerations in a situation that gives flesh-and-blood humans little enough time to do so.