For the benefit of other readers I feel I have to point out that this is wrong. You have incorrectly applied Ohm's Law. Wattage is power, not resistance (or in the AC world, impedance). The unit of resistance/impedance is Ohms, not Watts, and therefore your calculation is wrong.
I'll show how this should properly be applied by describing a simple experiment you can try. Grab a multimeter and measure the resistance of a Par Can fixture. Since we need to measure resistance, a purely resistive load such as a Par Can is best; once we introduce units with transformers, things get much more complicated and it's far harder to apply Ohm's Law directly. Performing this experiment myself, I measured a cold resistance of 2.2 ohms and a hot resistance of 29.85 ohms on one of my American DJ 500W Par 64 fixtures. According to my AC volt meter, the loaded voltage of my circuit was 121.8 V. Now I can apply Ohm's Law:
V = IR, where V = 121.8 V and R = 29.85 ohms
Dividing V by R gives a current of 4.08 A.
I can take this a step further by saying Wattage = Volts X Amps.
Multiplying 121.8 V by 4.08 A gives me a wattage of 497 W... sounds about right for a 500 W lamp.
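The calculation above can be sketched in a few lines of Python, using the measured values from my experiment:

```python
# Ohm's Law applied to the measured Par 64 values above.
voltage = 121.8         # loaded circuit voltage (V), measured
hot_resistance = 29.85  # hot filament resistance (ohms), measured

current = voltage / hot_resistance  # I = V / R
power = voltage * current           # P = V * I

print(f"Current: {current:.2f} A")  # ~4.08 A
print(f"Power:   {power:.0f} W")    # ~497 W
```

Note that using the cold resistance (2.2 ohms) here would give a wildly wrong answer, which is why the hot measurement matters.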
So to get back to the topic of this thread: how many Par 64s can I put on a circuit? As has been pointed out, a 20 A circuit at 120 V will handle 2400 W, so I could theoretically put 2400 / 497 ≈ 4.8 Par 64s on this circuit. Setting aside the effects of adding additional, smaller fixtures and the question of if/when a standard breaker will actually trip, I'd limit this circuit to 4 Par 64s. To be safe, it's typically best not to exceed 80% of a circuit's rated wattage for continuous loads. Sure, you could load 8 or 12 Pars onto a 20 A circuit and just be careful not to turn on too many at one time, but that's generally considered bad practice. The same goes for putting 8 Par 64s on a 20 A circuit but running them at a reduced level.
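Here's a quick sketch of that capacity math as a helper function (the function name and signature are my own, not anything standard), with and without the 80% continuous-load derating:

```python
def max_fixtures(breaker_amps, circuit_volts, fixture_watts, derate=1.0):
    """Whole number of fixtures that fit within the circuit's wattage capacity."""
    return int(breaker_amps * circuit_volts * derate // fixture_watts)

print(max_fixtures(20, 120, 497))              # 4 at the full 2400 W rating
print(max_fixtures(20, 120, 497, derate=0.8))  # 3 under the 80% continuous rule
```

So a strict 80% continuous-load budget would actually land you at 3 Par 64s, not 4; stage lighting usually isn't on continuously at full, which is why 4 is a reasonable practical limit.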
The best way I've personally found to determine how much current a fixture pulls is to actually measure it. In a pinch, look at what the fuse/breaker on each fixture is rated for, but this will generally be too conservative to be useful (e.g. a unit with a 10 A breaker may only pull about 5 A). There are many volt/amp/watt meters on the market, some costing less than $20. At some point before your next job, get out your lights and measure how much current each one pulls. Then just be sure not to put more than 15 or 20 amps' worth of lights on any single circuit (depending on the breaker rating) and you'll be good to go. Use quality electrical cable of sufficient gauge/type to minimize the voltage drop on longer runs carrying high continuous current.

Also note that halogen lamps tend to surge a little when you turn them on due to their lower cold resistance, discharge lamps do the same thing when striking (for a different reason), and strobe lights seem to be an exception to everything I've just said. I've never gotten an amperage reading on a xenon strobe that I trust, so I just go by trial and error with those when testing on my own time.
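If you do go measure each fixture ahead of time, the bookkeeping is trivial to script. A minimal sketch (the fixture names and current values here are made up for illustration):

```python
# Per-fixture currents you measured ahead of time (example values, not real data).
measured_amps = {"par64_500w": 4.1, "par38_150w": 1.3}

def circuit_ok(fixtures, breaker_amps):
    """fixtures: list of keys into measured_amps; True if the total fits the breaker."""
    total = sum(measured_amps[f] for f in fixtures)
    return total <= breaker_amps

print(circuit_ok(["par64_500w"] * 4, 20))  # True: 16.4 A on a 20 A breaker
```

You'd still want to leave headroom for inrush on halogens and strike current on discharge lamps, per the caveats above.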
Hope this helps clear things up!