# Sequence Juggler in Python

## Question:

I have to create a program that asks the user for natural numbers (or `0` to finish) and, for each number entered, builds its juggling sequence and displays it on the screen.

When the program ends, it should also report which number generated the longest sequence and which generated the shortest.

This is how the sequence is built (`int()` truncates, so each step takes the floor):

```
if number is even => number = floor(number^(1/2))
if number is odd  => number = floor(number^(3/2))
```

The sequence ends when number = 1.
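For illustration (setting aside the assignment's restriction to `while`/`if`/`else`), the rule above can be sketched as a small helper; `juggler` is just a name chosen here:

```python
def juggler(n):
    """Build the juggling sequence starting at n, down to 1 (inclusive)."""
    seq = [n]
    while n != 1:
        if n % 2 == 0:
            n = int(n ** 0.5)   # even: floor of the square root
        else:
            n = int(n ** 1.5)   # odd: floor of n^(3/2)
        seq.append(n)
    return seq

print(juggler(3))  # → [3, 5, 11, 36, 6, 2, 1]
```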

Here is the code:

```python
numero = int(input('Ingrese un numero natural o 0 para terminar: '))

while numero != 0:
    num = numero

    while num != 1:
        if num % 2 == 0:
            num = int(num ** 0.5)
            print(num, end=' ')
        else:
            num = int(num ** 1.5)
            print(num, end=' ')

    maximo = numero
    minimo = numero
    print()
    numero = int(input('Ingrese un numero natural o 0 para terminar: '))

print('La sucesion mas larga se genero con el numero:', maximo)
print('La sucesion mas corta se genero con el numero:', minimo)
```

I must do all of this using only `while`, `if`, `elif`, `else`, and the other constructs that already appear in the code.

As you will see, my problem occurs when the program finishes: it tells me that the `minimo` variable is not defined. And before that error, the other problem was that, when reporting which number generated the longest and the shortest sequence, it showed the same number for both (the maximum).

• The value of `contador_min` should be initialized to an arbitrarily large number. As you have it, initialized to 1, it will never change, since every sequence of "juggles" is longer than 1 step. That is why the `minimo` variable is never assigned and you get that error. To fix it, set `contador_min = 1000000`, for example.
• Almost more importantly (and harder to spot!): to increment the counter, instead of `contador += 1` you wrote `contador =+ 1`, which is essentially the same as `contador = +1`. That is, instead of incrementing the counter, you keep assigning it the value 1!