Random element from an array bigger than 32767 in bash
I have:

    mapfile -t words < <(head -10000 /usr/share/dict/words)
    echo "${#words[@]}"              # 10000
    r=$(( RANDOM % ${#words[@]} ))
    echo "$r ${words[$r]}"
This selects a random word from an array of 10,000 words. But once the array is bigger than 32767 entries (e.g. the whole file, 200k+ words), it stops working, because $RANDOM tops out at 32767. man bash says:

    Each time this parameter is referenced, a random integer between 0 and 32767 is generated.
    mapfile -t words < /usr/share/dict/words
    echo "${#words[@]}"              # 235886
    r=$(( RANDOM % ${#words[@]} ))   # how to change this?
    echo "$r ${words[$r]}"
I don't want to use Perl (perl -ple 's/.*/int(rand()*$_)/e'), because not every system has Perl installed. I'm looking for the simplest possible solution - I don't care about true randomness, this isn't cryptography. :)
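To see the failure concretely, here is a small sketch: since $RANDOM itself never exceeds 32767, taking it modulo the array size can only ever reach the first 32768 slots.

```shell
# Demonstrate the problem: RANDOM % 235886 can never land past index 32767.
n=235886
max=0
for _ in {1..1000}; do
  r=$(( RANDOM % n ))
  if (( r > max )); then max=$r; fi
done
echo "largest index seen in 1000 draws: $max"
```

Every index from 32768 up to 235885 is unreachable, so most of the dictionary is never selected.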
One possible solution is to do some maths on the outcome of $RANDOM:

    # multiply by 32768 (not 32767) so the two 15-bit halves tile the
    # range 0..2^30-1 exactly, with no gaps or overlapping values
    big_random=$(( RANDOM * 32768 + RANDOM ))
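Putting that together with the original indexing, a minimal sketch (using a generated 100,000-entry array as a stand-in for the words file):

```shell
# Stand-in for the dictionary: an array of 100000 entries (entry i is "i").
words=( {0..99999} )

# Two $RANDOM draws give 30 random bits; multiplying the first draw by
# 32768 makes the combined value uniform over 0..2^30-1.
big_random=$(( RANDOM * 32768 + RANDOM ))
r=$(( big_random % ${#words[@]} ))
echo "$r ${words[$r]}"
```

Since 2^30 is far larger than the array, the remaining modulo bias is tiny.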
Another approach is to use $RANDOM once to pick a block of the input file, then $RANDOM again to pick a line within that block.
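A sketch of that block-then-offset idea, assuming a 235886-line input like the question's; note the final % n wraps draws that land in the short last block, which reintroduces a little non-uniformity:

```shell
# Pretend the input has 235886 lines, like /usr/share/dict/words.
n=235886
block_size=32768                    # one $RANDOM draw can cover this much
num_blocks=$(( (n + block_size - 1) / block_size ))

block=$(( RANDOM % num_blocks ))    # first draw: which block
offset=$(( RANDOM % block_size ))   # second draw: which line in the block
r=$(( (block * block_size + offset) % n ))   # wrap picks past the end
echo "$r"
```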
Note that $RANDOM doesn't let you specify a range, and using % gives a non-uniform result. There is further discussion at: How to generate a random number in bash?
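If the modulo bias ever does matter, the standard fix is rejection sampling: discard draws from the top sliver of the range that doesn't divide evenly by n. A sketch, with rand_uniform as a hypothetical helper name:

```shell
# Hypothetical helper: uniform random integer in [0, n) by rejection
# sampling, avoiding the bias of a plain modulo.
rand_uniform() {
  local n=$1
  local span=$(( 1 << 30 ))           # two $RANDOM draws cover 0..2^30-1
  local limit=$(( span - span % n ))  # largest multiple of n <= 2^30
  local r
  while
    r=$(( RANDOM * 32768 + RANDOM ))
    (( r >= limit ))                  # reject draws from the uneven tail
  do :; done
  echo $(( r % n ))
}

r=$(rand_uniform 235886)
echo "$r"
```

Rejections are rare when n is much smaller than 2^30, so the loop almost always runs once.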
As an aside, it doesn't seem particularly wise to read the whole of words into memory. Unless you'll be doing a lot of repeated accesses to that data structure, consider trying to do it without slurping the whole file in at once.
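One way to pick a single random line without loading the file is reservoir sampling in awk (which, unlike Perl, is available on essentially every system). A sketch; sample_words.txt is a small stand-in file created here for illustration:

```shell
# Make a small stand-in for /usr/share/dict/words.
printf '%s\n' alpha bravo charlie delta > sample_words.txt

# Reservoir sampling with a reservoir of size one: keep line NR with
# probability 1/NR, so every line ends up equally likely while only
# one line is held in memory at a time. Seed from $RANDOM so repeated
# runs within one second differ.
pick=$(awk -v seed="$RANDOM" '
  BEGIN { srand(seed) }
  rand() * NR < 1 { keep = $0 }
  END { print keep }' sample_words.txt)
echo "$pick"
rm -f sample_words.txt
```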