Short answer: No.
Long answer:
A transmitter is designed to work into a 50 ohm load, which means the radio expects the antenna system to present an impedance of 50 ohms.
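If it helps to see what "matched" means in numbers, here's a rough Python sketch using the standard reflection-coefficient formulas. The 50 ohm reference is the usual convention; the example load values are made up purely for illustration:

```python
# Rough sketch: the SWR that a given load presents to a radio designed
# for a 50 ohm system. Standard transmission-line formulas; the example
# loads below are invented for illustration, not measured values.
Z0 = 50.0  # reference impedance the radio is designed for, in ohms

def swr(z_load, z0=Z0):
    """SWR of a (possibly complex) load impedance in a z0-ohm system."""
    gamma = abs((z_load - z0) / (z_load + z0))  # |reflection coefficient|
    return (1 + gamma) / (1 - gamma)

print(swr(50))         # 1.0 -- a perfect match
print(swr(75))         # 1.5 -- e.g. a 75 ohm load on a 50 ohm radio
print(swr(100 + 25j))  # ~2.2 -- a reactive, mismatched antenna
```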
Let's be honest here: CBs aren't exactly renowned for being high-quality precision equipment, so chances are the SWR will be slightly different after changing radios.
Your SWR varies with frequency. Unless there's one channel (frequency) that you know you're going to be using exclusively, you generally tune for the lowest possible SWR in the middle of the band. For CB that's around channel 19 or 20 (27.185 MHz, channel 19, is the center of the band). You'll have the lowest SWR there, and it will climb as you move toward channels 1 and 40.
However, it's probably not going to make a whole lot of difference. The spread between channels 1 and 40 is only 440 kHz, so the SWR difference from one end of the band to the other is probably less than the error range of your SWR meter.
Assuming you're running the legal limit of 4 W AM / 12 W SSB, then unless your antenna is WAY out of adjustment you probably won't notice any difference after switching radios. The difference between a 1.0:1 and a 2.0:1 SWR is about 0.5 dB of mismatch loss, which at the 4 W limit means 4 watts versus roughly 3.6 watts. 400 milliwatts is not a noticeable difference. Likewise, at 12 watts with a 2:1 SWR you're giving up about 1.3 watts - well below anything that's going to make any difference.
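Here's where those numbers come from, as a quick Python sketch. It uses the standard mismatch-loss formula and treats the reflected power as simply lost, which is the same approximation as above; the power levels are just the legal limits already mentioned:

```python
import math

# Quick check on the numbers above: what a 2:1 SWR costs at the
# 4 W AM / 12 W SSB limits, treating reflected power as simply lost.
def mismatch(swr, forward_w):
    gamma = (swr - 1) / (swr + 1)            # reflection coefficient magnitude
    reflected = gamma ** 2                   # fraction of power reflected
    delivered = forward_w * (1 - reflected)  # power that actually gets radiated
    loss_db = -10 * math.log10(1 - reflected)
    return delivered, loss_db

for power in (4, 12):
    delivered, loss_db = mismatch(2.0, power)
    print(f"{power} W at 2:1 SWR -> {delivered:.1f} W delivered, {loss_db:.2f} dB loss")

# 4 W at 2:1 SWR -> 3.6 W delivered, 0.51 dB loss
# 12 W at 2:1 SWR -> 10.7 W delivered, 0.51 dB loss
```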
Most CBs should have a foldback circuit that will prevent any damage to the PA. Unless you're running an amplifier, just being in the ballpark is fine.