My New 600w Isn't Bright

Discussion in 'Growing Marijuana Indoors' started by blood_on_snow, Jan 29, 2008.

  1. I recently switched out my 250w magnetic ballast for a 600w future-brite ballast. The bulb doesn't seem to be much brighter than the 250w, which is kind of weird. Also, the temperature doesn't seem to be much higher, which is surprising.

    The bulb I'm using says it's 220v 600w, and my ballast is 120v 600w. Could that be the problem, or am I just too blind to see the difference in luminosity?
     
  2. The light should be the same, it will just cover a larger area; I just got a 600w kit and it says it can cover a 6x6 space.
    I don't think it matters what the bulb says; what I have heard is that by using the 220v current you end up using less energy than you would on a 120v.
     
  3. I'm no electrician, but it seems that if you put a 220V bulb in a 120V socket it will only burn half as bright. Just like using a 6v battery on a 12v bulb = dim.
     
  4. seedling, you seem to be a little new to this, so let me help you out. When people say that a light will cover a given area, they mean that the bulb puts out enough light to ensure that every plant in that area receives at least the minimum amount of light needed for optimal plant growth. The higher the wattage, the more luminosity, and the further you can get from the bulb before that luminosity drops too low for the plant to grow. Luminosity is how bright the bulb is; this can also be measured in fc (foot candles). Here's a link for you... http://www.angelfire.com/cantina/fourtwenty/yor/lightres.htm

    oldskool, that's exactly what I was thinking. I even looked it up and read something that supports your theory, but I wanted to hear from other growers whether I will lose light if I use the wrong voltage bulb.
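The coverage/brightness trade-off described above can be sketched with the inverse-square law for a bare point-source bulb. This is an idealization I'm adding, not something from the thread: a hood or reflector changes the real numbers considerably, and the lumen figure below is a hypothetical, not a manufacturer spec.

```python
import math

def foot_candles(lumens, distance_ft):
    """Foot-candles at a given distance from an idealized point source.

    Light spreads over a sphere of area 4*pi*r^2 (square feet when r is
    in feet), so fc = lumens / (4 * pi * r^2).
    """
    return lumens / (4 * math.pi * distance_ft ** 2)

# Illustrative only: treat 90000 lumens as a hypothetical 600w HPS output.
print(foot_candles(90000, 1))  # brightness at 1 ft
print(foot_candles(90000, 2))  # at 2 ft: one quarter of the 1 ft value
```

The point is the scaling, not the absolute numbers: doubling the distance cuts the foot-candles to a quarter, which is why a higher-wattage bulb can sit further from the canopy before the light drops below what the plants need.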
     
  5. I have not found a definitive answer, but this addresses the inverse case: running a low-voltage bulb on a higher-voltage line.

    From WikiAnswers:
    Let's look at a simple resistive circuit. The current flowing through the circuit equals the voltage divided by the resistance: I = V/R, i.e. V = IR. R remains constant, as it is a trait of the load. So, as the voltage doubles, the current flowing through it doubles. Each part of a system dissipates power, and power is the product of voltage and current. Since P = IV and V = IR, P = I^2·R. Thus if the current flowing through a component doubles, the amount of power dissipated by that component quadruples!
    So let's say you have a 120W light bulb. This bulb has 120 ohms of resistance and normally draws 1A at 120V. Now let's say that you hook this bulb up to 240V. Now it is drawing 2A, and is dissipating 480W. Since the bulb is not designed to dissipate that much energy, the weak points burn open.
    So, when you connect a load to a voltage outside of its rated input, you increase the energy dissipated by it to a value outside of its operating range. This causes component failure in the form of "burning out."
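The arithmetic in the quote above can be sketched in a few lines. This is the same idealized constant-resistance model WikiAnswers uses; a real HPS lamp on a magnetic ballast is not a simple resistor, so treat the numbers as rough estimates only.

```python
def power_at_voltage(rated_watts, rated_volts, applied_volts):
    """Power drawn by an idealized resistive load at an off-spec voltage.

    The resistance is fixed by the rating (R = V_rated^2 / P_rated),
    then the actual draw is P = V_applied^2 / R.
    """
    resistance = rated_volts ** 2 / rated_watts
    return applied_volts ** 2 / resistance

# The WikiAnswers example: a 120 W / 120 V bulb on 240 V quadruples to 480 W
print(power_at_voltage(120, 120, 240))

# The thread's case: a 600 W lamp rated for 220 V run from a 120 V ballast
print(power_at_voltage(600, 220, 120))  # roughly 178 W under this model
```

Under this model the undervolted lamp draws around 178 W rather than 600 W; the quote's "double the voltage, quadruple the power" figure gives 150 W if you treat 220 V as roughly double 120 V.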
     
  6. So according to that, what I'm getting is like 150 watts?
     
  7. That's what I'm thinking. Can you exchange the light/ballast for one that uses the correct voltage? Otherwise, you might need to run a 220V service to the room you're using your light in.

     
  8. I talked to someone recently who sells hps, here's what he said...

    Does the voltage of your lamp not matter? I was under the impression that a bulb has to match the voltage with their ballast to work properly.
    -blood_on_snow

    Lamps need to match the wattage, not the voltage, of the ballast. Ballasts
    may be wired for several voltages though. So a 600w lamp needs a 600w
    ballast to run it.
    -Merchant
     
