In my 25 years as a contractor, I don't know how many times I've been asked this question. The answer is no. In most cases (especially in non-industrial settings) voltage has zero effect on the cost of electricity used.
We are charged for electricity in kilowatt-hours. A kilowatt-hour is 1,000 watts used over one hour of time. A watt is a measure of power, the rate at which an appliance uses electricity, and for a given appliance it stays the same regardless of the supply voltage (in homes and most commercial buildings). We typically only use higher voltages to save money on copper wire. The reason is that wire size is determined by amperage, and amps = watts / volts, so if we want to be able to use smaller (cheaper) wire, we must increase the voltage.
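The amps = watts / volts relationship can be sketched in a few lines of Python (the function name here is mine, just for illustration):

```python
def amps(watts, volts):
    """Current drawn by a load of the given wattage at the given voltage."""
    return watts / volts

# The same 5,000-watt load draws half the current at double the voltage.
print(amps(5000, 110))  # about 45.45 amps
print(amps(5000, 220))  # about 22.73 amps
```

Notice that the wattage (the work being done) never appears differently in the two calls; only the current changes, and current is what sizes the wire.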
Knowing that wattage is constant and is what actually measures the work performed, we can know for a fact that in most circumstances (all that relate to homes), voltage will have NO effect on our power bill. Here is a simple example:
Let's say a hot water heater is 5,000 watts. If we run it on 110 volts it will draw about 45.45 amps. By code, a circuit breaker can only carry a continuous load (like this 5,000 watts) at 80% of its rating, so we would be required to install a 60 amp circuit. Knowing our wire tables, a 60 amp circuit requires #4 wire, which at the time of this writing was priced at $3.77 per foot. If we wire the same hot water heater at 220 volts, we can use a 30 amp circuit and #10 wire, which is priced at 52 cents per foot.
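The breaker sizing above can be sketched as follows. This is a simplified illustration of the 80% continuous-load rule, not a substitute for the code tables; the list of standard breaker sizes and the function name are my own assumptions:

```python
def required_breaker(watts, volts, standard_sizes=(15, 20, 30, 40, 50, 60)):
    """Smallest standard breaker whose 80% continuous rating covers the load."""
    amps = watts / volts
    needed = amps / 0.8  # continuous loads may only use 80% of the breaker rating
    return next(size for size in standard_sizes if size >= needed)

# Same 5,000-watt heater, two different voltages:
print(required_breaker(5000, 110))  # 60 amp circuit -> #4 wire
print(required_breaker(5000, 220))  # 30 amp circuit -> #10 wire
```

At 110 volts the load needs 45.45 / 0.8 = 56.8 amps of breaker capacity, pushing us to a 60 amp circuit; at 220 volts it needs only 28.4 amps, so a 30 amp circuit does the job with much cheaper wire.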
Our wattage never changed, but the cost of installation fell dramatically with the higher voltage. The hot water heater still uses 5,000 watts (5 kilowatts) no matter what, which means every hour it is on, it uses 5 kilowatt-hours of energy. In my area, we pay around 7 cents per kilowatt-hour, so with this hot water heater we would pay about 35 cents per hour of use no matter the voltage.
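The operating-cost arithmetic can be sketched the same way (again, the function name is mine):

```python
def hourly_cost_cents(watts, cents_per_kwh):
    """Cost in cents for one hour of continuous use; voltage never enters in."""
    kilowatts = watts / 1000
    return kilowatts * cents_per_kwh

# 5,000-watt heater at 7 cents per kilowatt-hour:
print(hourly_cost_cents(5000, 7))  # 35.0 cents per hour
```

Voltage does not appear anywhere in the calculation, which is the whole point: it changes the wire you buy once, not the power you pay for every month.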