
Topic Closed: Sci Fi TV science or fiction?

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 07:35
Originally posted by Dean:

Well... duh.
 
Actually, your reply is a little disappointing, to be honest. Retention of the incremental changes is not necessary, so they are not accumulated information and do not contribute to the total information required. It is not data compression because once you know t(n+1), then t(n-1) is redundant; the information content remains constant from the initial conditions through to the final result.
 
 
Easy to say in retrospect; they didn't know that at the time. But you seem to know that the description of a human is uncompressible even in principle, so good for you. Even Maxwell had to learn how to suck an egg when Einstein proved that light is quantized.

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 08:21
It was retrospect that created formulae from a small array of observational data, and those formulae were then used to create vast tables of computed data - but unless you know where (and when) to start, the formulae are useless, and without the means to process the data, the formulae and the data are both useless. Vast tables of data were used in the 18th century because we did not have computers, not because we did not have the formulae; the tables were not created from observation. The formulae were used to create the tables by hand (by human computers) - because this was a lengthy (and error-prone) task, tables were made so people didn't have to re-do the calculation every time. Charles Babbage was funded by the British government for 10 years so he could build "an engine" that would calculate these tables without human error - his Difference and Analytical Engines applied a formula to a set of known conditions t(n) to calculate the next value t(n+1). This is the method of finite differences, the same method Newton used for his interpolation formula.
btw: James Clerk Maxwell had been dead some 30 years by Einstein's time. Had he lived, he would have understood quantum physics well enough.
 
 
 
I never said the data is uncompressible. Read what I have written and stop inventing what you think I wrote.
 
Data compression is obviously possible, and simple "deflate"-type compression techniques would reduce the amount of data by an order of magnitude, but that is not enough to provide any practical gain for encoding a human being for transmission as data. Yes, in the future we will come up with improved compression techniques that could raise the current compression ratios to several orders of magnitude, but that still will not be enough.

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 09:47
Originally posted by Dean:

It was retrospect that created formulae from a small array of observational data, and those formulae were then used to create vast tables of computed data
That the amount of observational data required to find the regularities that could eventually be represented by a formula was "a small array" (in the early scientific period of the 16th and 17th centuries) is debatable. I have always understood that Kepler could only come up with his laws of planetary motion thanks to having access to the vast (for the period, at least) amounts of observational data collected by Tycho Brahe.
 
I know Maxwell was dead by then; I only meant that he would likely have thought it (at least at first) incompatible with his wave theory. Remember that at that time the debate was "wave or particle", and it took quite some debate and experimental evidence to accept that a duality of both was the actual solution. I believe it was none other than Max Planck who, as late as 1913, said about Einstein's quantum hypothesis: "That he may sometimes have missed the target in his speculations, as, for example, in his hypothesis of light-quanta, cannot really be held too much against him". Enough said, I guess: what one day looks plainly silly to one of the most educated men on Earth may be proven correct not too many years later, and that most educated man may need to learn a new way of sucking an egg.

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 09:56
Originally posted by Gerinski:

Originally posted by Dean:

It was retrospect that created formulae from a small array of observational data, and those formulae were then used to create vast tables of computed data
That the amount of observational data required to find the regularities that could eventually be represented by a formula was "a small array" (in the early scientific period of the 16th and 17th centuries) is debatable. I have always understood that Kepler could only come up with his laws of planetary motion thanks to having access to the vast (for the period, at least) amounts of observational data collected by Tycho Brahe.
Missed the point by a country mile, but never mind, you carry on.
 
Originally posted by Gerinski:

 
I know Maxwell was dead by then; I only meant that he would likely have thought it (at least at first) incompatible with his wave theory. Remember that at that time the debate was "wave or particle", and it took quite some debate and experimental evidence to accept that a duality of both was the actual solution. I believe it was none other than Max Planck who, as late as 1913, said about Einstein's quantum hypothesis: "That he may sometimes have missed the target in his speculations, as, for example, in his hypothesis of light-quanta, cannot really be held too much against him". Enough said, I guess: what one day looks plainly silly to one of the most educated men on Earth may be proven correct not too many years later, and that most educated man may need to learn a new way of sucking an egg.
A point not so much missed as flown off on a tangent. Hey-ho.
 
 
 
 
 
 
 
 
 
[for future reference: the idiom "teaching granny to suck eggs" would mean that Einstein would be attempting to teach Maxwell something he already knew - that light was an electromagnetic wave, for example. It precisely does not mean he would have been teaching him something new - that is "teaching an old dog a new trick", a different idiom with a completely different meaning]


Edited by Dean - August 07 2013 at 10:25

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 10:25
Originally posted by Dean:

A point not so much missed as flown off on a tangent. Hey-ho.
Why so? You just predicted that it will never be practically feasible to encode the information required to describe a human and to transmit that amount of information in any reasonable way.
 
Auguste Comte 1842: "We can never learn the stars' internal constitution, nor, in regard to some of them, how heat is absorbed by their atmosphere. We can never know anything of the planets' chemical or mineralogical structure; and, much less, that of organized beings living on their surface."
 
Lord Kelvin 1895: "Heavier-than-air flying machines are impossible."
 
Albert Einstein 1934: "There is not the slightest indication that nuclear energy will ever be obtainable. It would mean that the atom would have to be shattered at will."
 
Black holes were considered impossible when light was believed (correctly) to be massless and therefore (incorrectly) thought to be unaffected by gravity.
 

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 10:31
Originally posted by Dean:

Originally posted by Gerinski:

That the amount of observational data required to find the regularities that could eventually be represented by a formula was "a small array" (in the early scientific period of the 16th and 17th centuries) is debatable. I have always understood that Kepler could only come up with his laws of planetary motion thanks to having access to the vast (for the period, at least) amounts of observational data collected by Tycho Brahe.
Missed the point by a country mile, but never mind, you carry on.
Sorry for being so obtuse and carrying on, but can you please explain again why it is incorrect to say that Kepler found a way to compress the large amounts of data collected by Tycho Brahe? I believe that after his findings a lot of that data was found to be redundant and could be thrown away without losing its informational value. The same amount of data could be reproduced with fewer bits of information required to store it. Why is that not compression?

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 10:42
Originally posted by Gerinski:

Originally posted by Dean:

A point not so much missed as flown off on a tangent. Hey-ho.
Why so? You just predicted that it will never be practically feasible to encode the information required to describe a human and to transmit that amount of information in any reasonable way.

Auguste Comte 1842: "We can never learn the stars' internal constitution, nor, in regard to some of them, how heat is absorbed by their atmosphere. We can never know anything of the planets' chemical or mineralogical structure; and, much less, that of organized beings living on their surface."

Lord Kelvin 1895: "Heavier-than-air flying machines are impossible."

Albert Einstein 1934: "There is not the slightest indication that nuclear energy will ever be obtainable. It would mean that the atom would have to be shattered at will."

Black holes were considered impossible when light was believed (correctly) to be massless and therefore (incorrectly) thought to be unaffected by gravity.
 
Yes, clever men make assumptions that later prove to be incorrect - that goes for you as well as for me. That is not what I was talking about when I made the sucking-eggs jape.
 
Hence my added footnote:
 
Originally posted by Dean:

[for future reference: the idiom "teaching granny to suck eggs" would mean that Einstein would be attempting to teach Maxwell something he already knew - that light was an electromagnetic wave, for example. It precisely does not mean he would have been teaching him something new - that is "teaching an old dog a new trick", a different idiom with a completely different meaning]
 
 

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 10:52
Originally posted by Dean:

Yes, clever men make assumptions that later prove to be incorrect - that goes for you as well as for me.
The difference between you and me is that I acknowledge that, whereas you seem to know what will be possible and what will not. I don't positively say that everything which seems impossible to us today will be reality in the future; I just acknowledge that it would be naive of me to be certain that, because something seems impossible to me, it will never be reality.
 
Originally posted by Dean:

That is not what I was talking about when I made the sucking-eggs jape.
Hence my added footnote:

Originally posted by Dean:

[for future reference: the idiom "teaching granny to suck eggs" would mean that Einstein would be attempting to teach Maxwell something he already knew - that light was an electromagnetic wave, for example. It precisely does not mean he would have been teaching him something new - that is "teaching an old dog a new trick", a different idiom with a completely different meaning]
 
 
Alright, that may have been just a misinterpretation by a non-English speaker; I thought you were hinting that I was being stupid by telling you something obvious, when I was saying precisely that what you seemed to consider so obvious might not be as obvious as you believed. No offence taken.

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 11:10
Originally posted by Gerinski:

Originally posted by Dean:

Originally posted by Gerinski:

That the amount of observational data required to find the regularities that could eventually be represented by a formula was "a small array" (in the early scientific period of the 16th and 17th centuries) is debatable. I have always understood that Kepler could only come up with his laws of planetary motion thanks to having access to the vast (for the period, at least) amounts of observational data collected by Tycho Brahe.
Missed the point by a country mile, but never mind, you carry on.
Sorry for being so obtuse and carrying on, but can you please explain again why it is incorrect to say that Kepler found a way to compress the large amounts of data collected by Tycho Brahe? I believe that after his findings a lot of that data was found to be redundant and could be thrown away without losing its informational value. The same amount of data could be reproduced with fewer bits of information required to store it. Why is that not compression?
 
 
It was a small array of data compared to the amount of tabular data that Kepler was able to produce after he formulated his three laws of planetary motion, and Kepler's data was also considerably more accurate because it did not contain observational error. As I said in my previous post, the data was still needed in tabular form because it was not practical to compute it afresh every time it was needed. Also, redundant data is not compression, it is redundancy - a different thing altogether.

Again: if you don't know where the planet is and what its orbit is relative to the Sun, then Kepler's laws will not tell you where it will be or where it was (six pieces of data are required: three for position and three for orbital velocity) - that is not data compression, since the same amount of data has to be put into the system as you get out. If you have three packets of data x, y and z and apply a transformation, then you get x', y' and z' - three packets of data in = three packets of data out, and zero compression ensues (you can now discard x, y and z because you can recompute them by applying the reverse transformation). The transform is not the data - you cannot put only x and y in and hope to get x', y' and z' out. When you then apply the transform a second time you get x'', y'' and z'' (and you can now discard x', y' and z'). However, if you wanted to get from x, y and z to x'', y'' and z'', then you'd either have to apply the transform twice or use a different transform. BUT each time you still need to know the previous state - the transform alone tells you nothing. Kepler's three laws do not tell you the orbital speed of Venus at perihelion or aphelion just by looking at the formulae.
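To put that bookkeeping in concrete terms, here is a minimal Python sketch; the propagate_state() function is an invented stand-in for Kepler's formulae (it is not real orbital mechanics), and the only point is that six numbers go in and six numbers come out:

```python
# Minimal sketch: an orbital "transform" maps a six-number state to a new
# six-number state. The function below is a placeholder for Kepler's laws
# (or any propagation formula); the point is only the bookkeeping.

def propagate_state(state, dt=1.0):
    """Hypothetical one-step propagation: six numbers in, six numbers out."""
    x, y, z, vx, vy, vz = state
    # stand-in dynamics: drift the position by the velocity (not real physics)
    return (x + vx * dt, y + vy * dt, z + vz * dt, vx, vy, vz)

state_t0 = (1.0, 0.0, 0.0, 0.0, 0.5, 0.0)   # initial conditions: 6 values
state_t1 = propagate_state(state_t0)         # still 6 values
state_t2 = propagate_state(state_t1)         # still 6 values

# You can discard state_t0 once you have state_t1 (the step can be reversed),
# but you can never hold fewer than six numbers and still know where you are.
assert len(state_t1) == len(state_t0) == 6
```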

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 11:19
Originally posted by Gerinski:

Originally posted by Dean:

Yes, clever men make assumptions that later prove to be incorrect - that goes for you as well as for me.
The difference between you and me is that I acknowledge that, whereas you seem to know what will be possible and what will not. I don't positively say that everything which seems impossible to us today will be reality in the future; I just acknowledge that it would be naive of me to be certain that, because something seems impossible to me, it will never be reality.
Nope - that's having your cake and eating it (I love English idioms) - you extrapolate small possible things into big impossible things. I say the small things cannot be extrapolated like that - nothing in the Universe happens like that. You say you can see a photon (you can't); you say you can see a photon travelling at the same speed as you (you can't); you say that when you travel at the speed of light and look back you see yourself in the past (you can't). When I say "you can't" I am pointing out a big impossible thing - that is not a naive assumption.
 
Originally posted by Gerinski:

Originally posted by Dean:

That is not what I was talking about when I made the sucking-eggs jape.
Hence my added footnote:

Originally posted by Dean:

[for future reference: the idiom "teaching granny to suck eggs" would mean that Einstein would be attempting to teach Maxwell something he already knew - that light was an electromagnetic wave, for example. It precisely does not mean he would have been teaching him something new - that is "teaching an old dog a new trick", a different idiom with a completely different meaning]

Alright, that may have been just a misinterpretation by a non-English speaker; I thought you were hinting that I was being stupid by telling you something obvious, when I was saying precisely that what you seemed to consider so obvious might not be as obvious as you believed. No offence taken.
None intended - I gave the link to avoid any ambiguity.

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 13:21
Originally posted by Dean:

You say you can see a photon (you can't); you say you can see a photon travelling at the same speed as you (you can't); you say that when you travel at the speed of light and look back you see yourself in the past (you can't). When I say "you can't" I am pointing out a big impossible thing - that is not a naive assumption.
It was this one I was referring to:
 
Originally posted by Dean:

Data compression is obviously possible, and simple "deflate"-type compression techniques would reduce the amount of data by an order of magnitude, but that is not enough to provide any practical gain for encoding a human being for transmission as data. Yes, in the future we will come up with improved compression techniques that could raise the current compression ratios to several orders of magnitude, but that still will not be enough.
 
 

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 13:26

Consider pi written numerically 3.14159 26535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ....

Do you consider that 
 
pi = C / d is a more compressed way of expressing the same information?
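For what it's worth, the digits above can indeed be regenerated from a short program plus a single "how many digits" parameter. Here is a minimal Python sketch using Machin's formula, pi = 16*arctan(1/5) - 4*arctan(1/239), with scaled integer arithmetic (the function names are invented for the example):

```python
# Minimal sketch: regenerate the digits of pi from a short formula.
# Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239), evaluated with
# scaled integers so the result is exact to the requested number of digits.

def arctan_inv(x, digits):
    """arctan(1/x) scaled by 10**(digits+10), via the alternating Taylor series."""
    scale = 10 ** (digits + 10)          # extra guard digits
    term = scale // x
    total, n, sign = term, 1, 1
    while term:
        n += 2
        sign = -sign
        term = scale // (x ** n)
        total += sign * (term // n)
    return total

def pi_digits(digits=100):
    pi_scaled = 16 * arctan_inv(5, digits) - 4 * arctan_inv(239, digits)
    return pi_scaled // 10 ** 10         # drop the guard digits

print(pi_digits(100))   # 31415926535897932384626433832795...
```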

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 14:16
Originally posted by Gerinski:

It was this one I was referring to:

Originally posted by Dean:

Data compression is obviously possible, and simple "deflate"-type compression techniques would reduce the amount of data by an order of magnitude, but that is not enough to provide any practical gain for encoding a human being for transmission as data. Yes, in the future we will come up with improved compression techniques that could raise the current compression ratios to several orders of magnitude, but that still will not be enough.
 
 
I was referring to all of them, I just could not be bothered to list them all.
 
However, to this specific point:
 
Whether we use my simple guesstimate of the number of atoms in the human body or the Leicester University calculation of the total data content, the figure we arrive at is an enormous number; whether we use my simple guesstimate of the data-transmission time based upon known limitations of bandwidth and data rates, or the calculated value from Leicester University, the time taken to transmit the data is also an enormous number. Either way, the compression ratio required to a) get the total data content down to a manageable number and b) then get the transmission time of that data down to a practical value requires an extraordinary level of compression. The problem with any data compression is that there is a finite limit beyond which information is lost with no chance of recovery - there is a finite limit of irreducibility in any set of data. Unfortunately your proposals do not constitute data compression; a formula is a transformation.
 
For example, we know we can analyse a section of a waveform using Fourier analysis, which gives us a table of frequencies and amplitudes that can be used to reconstitute the waveform by applying the reverse transform. In principle this is an impressive level of data compression, because a relatively complex time-domain waveform is apparently described by a small array of frequency-domain data. But this has not compressed the data at all; it has convolved it from the time domain to the frequency domain and back again, and in doing so some information of the original waveform has been lost (it's called windowing and results in (sin x)/x distortion). Also, the resulting reconstituted wave repeats continuously, whereas the analysed section could have changed in the next section.
 
We cannot apply a single Fourier analysis to an entire piece of music and then reconstitute the same piece of music - from the analysis we would know that specific frequencies were present in the music and we would have a figure for their average amplitudes over the entire piece, but we would not know whether a specific tone was present for the entire duration of the piece or just for a single note, and we would not know the variations in amplitude of those tones through the piece. What we would get would be a continuous complex tone that lasts for the entire duration of the piece of music.
 
To encode the entire piece we would need to analyse the whole waveform continuously, sampling at twice the highest frequency we were interested in; the duration of the section we would have to analyse is one cycle of the lowest frequency we were interested in, and then we would repeat the analysis for the next window, shifted by the time interval of one sample. The resulting frequency-domain data pack is considerably larger than the equivalent time-domain data pack.
 
While this is a grossly over-simplified example of the problem, it illustrates the limitations in reducing a complex non-repetitive array of data down to a simpler array of data and a formula (or set of formulae). Repetitive data is a doddle, but as soon as you get non-repetitive impulse or stochastic data within that data set you need to record each one individually - and that limits the reducibility of the data.
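A rough numerical sketch of that limitation in Python with numpy (the sample rate, tone and window length are arbitrary choices for the example, not figures from the post): a single FFT of a whole signal yields as many numbers as there were samples and loses the timing of the tone, while a sliding-window analysis with one-sample hops produces far more numbers than the original recording.

```python
import numpy as np

fs = 8000                               # sample rate (arbitrary for the sketch)
t = np.arange(2 * fs) / fs              # 2 seconds of signal
# a tone that is only present in the second half - the "single note" case
x = np.sin(2 * np.pi * 440 * t) * (t >= 1.0)

# One FFT over the whole piece: the coefficient count matches the sample
# count, and the information about *when* the tone sounded is gone.
X = np.fft.rfft(x)
print(len(x), "samples ->", len(X), "complex coefficients")

# Sliding-window analysis (hop of one sample, as described above):
win = 400                               # one cycle of a 20 Hz lowest frequency
n_windows = len(x) - win + 1
coeffs_per_window = win // 2 + 1
print("windowed analysis stores", n_windows * coeffs_per_window,
      "complex coefficients vs", len(x), "original samples")
```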
 
 


Edited by Dean - August 07 2013 at 14:30

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 14:21
Originally posted by Gerinski:

Consider pi written numerically 3.14159 26535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ....

Do you consider that 
 
pi = C / d is a more compressed way of expressing the same information?
But what if you wanted to encode  3.14159 26535 49793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ... ?
 
What is the formula now?
 
Then this:  3.14159 16535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ...
 
and now this:  3.14159 26535 89793 23846 26433 83279 50288 41971 68399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ...
 
Now do you understand the magnitude of the problem?
 
 
 

Gerinski (Prog Reviewer)
Posted: August 07 2013 at 15:13
Originally posted by Dean:

Originally posted by Gerinski:

Consider pi written numerically 3.14159 26535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ....

Do you consider that 
 
pi = C / d is a more compressed way of expressing the same information?
But what if you wanted to encode  3.14159 26535 49793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ... ?
 
What is the formula now?
 
Then this:  3.14159 16535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ...
 
and now this:  3.14159 26535 89793 23846 26433 83279 50288 41971 68399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ...
 
Now do you understand the magnitude of the problem?
 
 
 
Sorry, but no. I know that there is uncompressible information (I already mentioned genuinely random data). I said that I see the laws of physics as algorithms capable of compressing the vast amount of information present in the physical world into a smaller number of bits (initial conditions + laws). Maybe more generally, that equations may be capable of encoding information. You said they are not.
Furthermore, I provided examples of things several smart people honestly considered impossible yet which became reality not long after. You continue to maintain that compressing and effectively transmitting the information present in a human being is impossible and always will be. No need to discuss this one any further; we will not agree on it.

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 07 2013 at 18:54
Originally posted by Gerinski:

Originally posted by Dean:

Originally posted by Gerinski:

Consider pi written numerically 3.14159 26535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ....

Do you consider that 
 
pi = C / d is a more compressed way of expressing the same information?
But what if you wanted to encode  3.14159 26535 49793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ... ?
 
What is the formula now?
 
Then this:  3.14159 16535 89793 23846 26433 83279 50288 41971 69399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ...
 
and now this:  3.14159 26535 89793 23846 26433 83279 50288 41971 68399 37510 58209 74944 59230 78164 06286 20899 86280 34825 34211 70679 ...
 
Now do you understand the magnitude of the problem?
 
 
 
Sorry, but no. I know that there is uncompressible information (I already mentioned genuinely random data). I said that I see the laws of physics as algorithms capable of compressing the vast amount of information present in the physical world into a smaller number of bits (initial conditions + laws). Maybe more generally, that equations may be capable of encoding information. You said they are not.
What you actually said was "Only genuinely random data can not be compressed", and I replied that some non-sequential patterns cannot be compressed. Sequential data is compressible - pi, for all its irrationality, is not random; it is predictable and can be expressed, as you said, as C / d [or as 2 x arccos(0)]. And since it is based upon circles and cosine waves it is actually sequential - not the digits in the sequence, but the value itself (and as data we are concerned with the value of the data, not its sequence of digits: the sequence of digits is immaterial - if we convert the numeric base the sequence is different but the value remains the same) - so the same formula produces the same value each and every time. Until, that is, I introduce a series of offsets; then it becomes C / d + k1 and C / d + k2 and C / d + k3, and what was compressed data is now a lot less compressed, because there is no formula for predicting the values of k1, k2 and k3... we have to send the value of C and the value of d and each value of kn for each of the n times that the value differs from pi - our transmission packet is increasing, not decreasing.
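A small Python sketch of that packet growth (the encoding is invented purely for illustration): every value that really is pi costs nothing extra, but each offset kn has to be sent explicitly, so the packet grows with every deviation.

```python
import math

# Invented toy encoding: represent each value as pi plus an explicit offset.
# Values that really are pi cost nothing extra; every deviation adds data.

def encode(values, tol=1e-12):
    offsets = {}                            # index -> offset from pi
    for i, v in enumerate(values):
        k = v - math.pi
        if abs(k) > tol:
            offsets[i] = k                  # must be sent explicitly
    return {"count": len(values), "offsets": offsets}

def decode(packet):
    return [math.pi + packet["offsets"].get(i, 0.0)
            for i in range(packet["count"])]

values = [math.pi, math.pi, math.pi + 0.001, math.pi, math.pi - 0.0004]
packet = encode(values)
print(packet["offsets"])        # two offsets to transmit, one per deviation
assert all(abs(a - b) < 1e-12 for a, b in zip(decode(packet), values))
```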
 
For the final time (because repetition is tedious): {initial conditions + laws} is not data compression - the data that defines the initial conditions is the same amount of data that the formula will produce - the net gain is zero, the net loss is zero. What defines the final result is not the law but the initial condition - change the initial conditions and the output changes - that's what formulae are for: calculating a new value based upon a new input. There is no formula known to man nor beast nor gods that can create data without feeding data in.
 
The laws of physics are not algorithms for compressing the vast amount of information present in the physical world into a smaller amount of data, because you have to apply the algorithm to each individual piece of information to produce a new piece of information.
Originally posted by Gerinski:

Furthermore, I provided examples of things several smart people honestly considered impossible yet which became reality not long after. You continue to maintain that compressing and effectively transmitting the information present in a human being is impossible and always will be. No need to discuss this one any further; we will not agree on it.
I don't hold smart people in awe, so I'm not surprised when they make pronouncements that later prove to be false (I am also wary of quotations taken out of context to make them look foolish - though Kelvin's hubris was common knowledge even in his day), and I would love to be proven wrong on human (tele)transportation, but there are many things that really clever people have said are impossible that will forever remain impossible. So, for the record, I will state categorically that: the amount of data required to transmit a human being will not be reduced to a number small enough to be transmitted in an acceptable time-frame, even using the entire electromagnetic spectrum.
 

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 08 2013 at 02:55
There are 150 million publications in the British Library. If the average book contains 100,000 words and the average number of letters per word is 7, then the total number of letters in all the books in the British Library would approximate to 1 x 10^14... transmitting that at 1TB would take a mere 8½ days. If we had an algorithm that could compress that to a single letter and applied that routine to the human data packet, it would still take 8,622 years to teleport a human.
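The letter count in that estimate is easy to check; the transmission-time half depends on what "at 1TB" is taken to mean, so the link rate in this Python sketch is only a placeholder assumption, not the figure used in the post:

```python
# Back-of-envelope check of the British Library estimate above.
publications = 150_000_000
words_per_book = 100_000
letters_per_word = 7

total_letters = publications * words_per_book * letters_per_word
print(f"total letters ~ {total_letters:.2e}")      # ~1.05e+14, i.e. ~1 x 10^14

def transmission_days(n_bytes, rate_bytes_per_s):
    """Time in days to send n_bytes at an assumed link rate (a placeholder)."""
    return n_bytes / rate_bytes_per_s / 86_400

# Example: one byte per letter over an assumed 125 MB/s (1 gigabit/s) link.
print(f"{transmission_days(total_letters, 125e6):.1f} days")
```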
 
Now tell me it will someday be possible to compress all the books in the British Library to a single 8-bit ASCII character.

Gerinski (Prog Reviewer)
Posted: August 08 2013 at 04:18
Originally posted by Dean:

For the final time (because repetition is tedious): {initial conditions + laws} is not data compression - the data that defines the initial conditions is the same amount of data that the formula will produce - the net gain is zero, the net loss is zero. What defines the final result is not the law but the initial condition - change the initial conditions and the output changes - that's what formulae are for: calculating a new value based upon a new input. There is no formula known to man nor beast nor gods that can create data without feeding data in.

The laws of physics are not algorithms for compressing the vast amount of information present in the physical world into a smaller amount of data, because you have to apply the algorithm to each individual piece of information to produce a new piece of information.
Sorry, it might not have been the final time... We don't know what the actual laws of physics are like. I doubt that they are fully deterministic in the sense that every algorithm is a simple 'one input = one output with the same information content'.
It is hard to think that a superhot, rather small ball of quark-gluon plasma already contained as much information as the current extremely complex universe, with its ~100 naturally occurring elements in all their multiple combinations and forms, and all the information present in you, me and every other living organism. Yet we tend to believe that that plasma ball evolved into the current universe via some set of unknown laws and principles which somehow produced the emergence of all the complexity we see today, without additional external input along the way. (And that is leaving aside the role of randomness: if randomness has a significant impact on the evolution of the universe, the same initial conditions and the same laws may result in completely different universes.) It still seems to me that the rather simple initial conditions + the laws contain less information than the ultra-complex final results.


Edited by Gerinski - August 08 2013 at 04:19

Dean (Special Collaborator, Retired Admin and Amateur Layabout)
Posted: August 08 2013 at 18:11
Originally posted by Gerinski:

Originally posted by Dean:

For the final time (because repetition is tedious): {initial conditions + laws} is not data compression - the data that defines the initial conditions is the same amount of data that the formula will produce - the net gain is zero, the net loss is zero. What defines the final result is not the law but the initial condition - change the initial conditions and the output changes - that's what formulae are for: calculating a new value based upon a new input. There is no formula known to man nor beast nor gods that can create data without feeding data in.

The laws of physics are not algorithms for compressing the vast amount of information present in the physical world into a smaller amount of data, because you have to apply the algorithm to each individual piece of information to produce a new piece of information.
Sorry, it might not have been the final time... We don't know what the actual laws of physics are like. I doubt that they are fully deterministic in the sense that every algorithm is a simple 'one input = one output with the same information content'.
It is hard to think that a superhot, rather small ball of quark-gluon plasma already contained as much information as the current extremely complex universe, with its ~100 naturally occurring elements in all their multiple combinations and forms, and all the information present in you, me and every other living organism. Yet we tend to believe that that plasma ball evolved into the current universe via some set of unknown laws and principles which somehow produced the emergence of all the complexity we see today, without additional external input along the way. (And that is leaving aside the role of randomness: if randomness has a significant impact on the evolution of the universe, the same initial conditions and the same laws may result in completely different universes.) It still seems to me that the rather simple initial conditions + the laws contain less information than the ultra-complex final results.
Only because it is a gross over-simplification that has reduced the argument to the absurd, leaving it with no valid meaning.
 
Yes, a set of simple Laws of Physics has created a complex Universe, but the amount of matter has not increased - what has changed is how the matter is arranged. It has become more complex, and the data that fundamentally describes it is unchanged. Each "piece" of matter can be described by four basic physical dimensions - length (distance), mass, time and charge - and with those we can determine (i.e. derive) all the physical properties of the "piece" of matter, such as its location, velocity, momentum, energy, temperature, etc. That is a fixed amount of data relating to each "piece" of matter; even when the matter is at rest it still has the properties of velocity and momentum, the values of those properties just happen to be zero (and we need to know that they are zero to know that it is at rest).

So when we move it (i.e. apply a transform) we create some new values for that data - it now has velocity and momentum - but we have not increased the amount of data associated with it; it's just that some of those parameters now have non-zero values. To move it we had to put something "in": we had to act upon it with a force that comes from another piece of matter, which also has a fixed amount of data associated with it. The transform we are applying transfers some of the data values from mass B to mass A - in this case the momentum of mass A is obtained from mass B - and the amount of data associated with mass A and mass B is unchanged, so the number of bits of data needed to describe each mass is unchanged. The laws of physics are not transforming the matter, they are transforming the properties of the matter, and those properties are data values - they do not create new data (the amount of data is unchanged, its values change), they transform old data values into new data values.
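A toy Python sketch of that bookkeeping (the field names and the one-dimensional collision are invented for illustration): each piece of matter carries the same fixed set of properties before and after the transform; only the values change.

```python
from dataclasses import dataclass, fields

# Toy illustration: every "piece" of matter carries the same fixed set of
# properties; a transform only changes their values, never their number.

@dataclass
class Piece:
    mass: float
    position: float
    velocity: float          # zero-valued fields still exist and still count

def collide_elastic_1d(a: Piece, b: Piece) -> None:
    """One-dimensional elastic collision: momentum moves from one piece to
    the other, but neither piece gains or loses any fields of data."""
    m1, m2, u1, u2 = a.mass, b.mass, a.velocity, b.velocity
    a.velocity = ((m1 - m2) * u1 + 2 * m2 * u2) / (m1 + m2)
    b.velocity = ((m2 - m1) * u2 + 2 * m1 * u1) / (m1 + m2)

a = Piece(mass=1.0, position=0.0, velocity=0.0)   # at rest: velocity is 0, not absent
b = Piece(mass=1.0, position=1.0, velocity=-2.0)

n_fields_before = len(fields(a)) + len(fields(b))
collide_elastic_1d(a, b)
n_fields_after = len(fields(a)) + len(fields(b))

print(a.velocity, b.velocity)                     # values changed: -2.0 0.0
assert n_fields_before == n_fields_after == 6     # amount of data unchanged
```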
 
The information that is associated with matter only came into existence when matter came into existence; without the matter there is no information. If you wanted to be really pedantic about it, you could even argue that more data was annihilated in the quark epoch than survived it. I couldn't care less whether the Universe is deterministic or not; metaphysical philosophical bullcrap does not interest me, and whether it is or isn't changes nothing - we experience one and only one ever-changing universe. Whether it could have produced something different is irrelevant; it produced this one and this is the only one we know. As I have shown several times before, unpredictability does not require random input.
 
 
 
However, all this is a futile digression that avoids the issue.
 
It would be easier if you would simply stick to the point and explain how the complex arrangement of atoms that forms a human being can be described by a set of formulae and a reduced set of data such that they can be used to reconstitute a replica of the human that is exact in every way (including all memories and personality), because at present you have not provided any evidence to suggest that this is ever going to be possible. In fact your example of the evolution of the Universe and its laws suggests that it will not produce an exact replica. [It is unlikely that the same rules and initial conditions would even result in a human, let alone one specific human out of 108 billion possible humans (the estimated total number of humans who have ever lived).]
 
 

Gerinski (Prog Reviewer)
Posted: August 09 2013 at 07:40
Originally posted by Dean:

It would be easier if you would simply stick to the point and explain how the complex arrangement of atoms that forms a human being can be described by a set of formulae and a reduced set of data such that they can be used to reconstitute a replica of the human that is exact in every way (including all memories and personality), because at present you have not provided any evidence to suggest that this is ever going to be possible. In fact your example of the evolution of the Universe and its laws suggests that it will not produce an exact replica. [It is unlikely that the same rules and initial conditions would even result in a human, let alone one specific human out of 108 billion possible humans (the estimated total number of humans who have ever lived).]
 
If I knew that, I wouldn't be typing in this forum but giving talks at MIT or CERN.
But I already gave blood as an example of a feature which does not need to be an atom-by-atom identical replica to ensure a perfectly viable reproduction. If the hair on my arms is not precisely identical to the hair I have now, that's not going to change who I am. I can shave it completely and I will still be me. You don't need to specify the hair on my head atom by atom, just the type of hair I have, the area covered and the hair density in each rough area, and I will not notice any difference; it will still be me. If I am reconstructed somewhere else with my hair slightly different, who cares?
I could even lose a finger and it would still be me (although I would not happily accept that as a successful teleportation). I could have a 5 cm shorter bowel and it would still be me. There is so much of me which need not be an identical atom-by-atom copy in order to still be me without making any practical difference. A lot of my body could be reconstructed with enough accuracy just from my DNA, my 3-D measurements and a few relevant features. Knowing how a leg is built, my legs' 3-D dimensions and maybe a few more features, my legs could be reconstructed; if this or that vein is a millimetre deeper or closer to the surface of my skin, it will still be me.
Of course some variations could alter important traits, affecting for example my life expectancy; those should not be allowed to change.
Again, I'm certainly not maintaining that it will eventually be possible to teleport a human, far from it! I just argued that the posted calculations of the information which would be required for that feat rest on the premise that there is no other way than specifying the atomic configuration of the human in every detail, and I don't think that is a valid argument.