7.11 Applications of Logical Depth to Image Classification

  1. In this lecture we will see how measures
    motivated by or based on logical depth

  2. derived from CTM can find
    some interesting applications,
  3. even though logical depth
    is a much more difficult measure
  4. than algorithmic probability
    and algorithmic complexity,
  5. because logical depth is neither lower
    nor upper semi-computable.
  6. So one can never know whether one
    is really approximating it or not.
  7. Yet one can define frameworks in which
    logical depth is very interesting to apply.
  8. We will first see how approximating logical depth
  9. by using the compression times
    of lossless compression algorithms
  10. is something that has not been done before,
  11. and that it is capable of producing
    some interesting results.
  12. We will also see how the compression times,
    and the effect of making perturbations in some of these experiments,
  13. conform to the theoretical expectations
    of logical depth.
  14. Remember that the main idea behind logical depth
  15. is that the decompression instructions
    for trivial or random objects
  16. are very simple to follow
    and are therefore executed very fast,
  17. with the decompression process
    taking only a small amount of time,
  18. simply because the compression algorithm
  19. either compresses very well in the case of simple objects
  20. or simply does not compress at all for random objects
  21. and the instructions for decompressing random objects
    are almost nonexistent.
  22. Longer run times, however, are the result of a process
  23. following a set of time-consuming decompression instructions,
  24. hence a deep, complex process according to logical depth.
  25. The decompression time of the compressed version of a string
  26. can be considered an estimation of Bennett's logical depth
    based on the algorithm used.
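
A minimal sketch of this estimation, in Python rather than the tools used in the lecture, with zlib standing in for any lossless compressor and the helper name logical_depth_estimate introduced purely for illustration: compress the object once, then time the decompression of the compressed version.

    import os
    import time
    import zlib

    def logical_depth_estimate(data: bytes, repeats: int = 20) -> float:
        """Best CPU time, in seconds, over `repeats` runs of decompressing the compressed data."""
        compressed = zlib.compress(data, 9)               # lossless compression step
        best = float("inf")
        for _ in range(repeats):
            start = time.process_time()                   # CPU time only
            zlib.decompress(compressed)
            best = min(best, time.process_time() - start)
        return best

    simple_obj = b"0" * 100_000          # trivial object: compresses very well
    random_obj = os.urandom(100_000)     # random object: barely compresses at all
    print(logical_depth_estimate(simple_obj), logical_depth_estimate(random_obj))

Both the trivial and the random object should come out shallow, with very short decompression times, whereas a structured object would be expected to take longer to decompress.
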
  27. There seems to have been no previous attempt
    to implement an application of ideas
  28. based on Bennett's logical depth by using compression or decompression times.
  29. But we implemented a simple framework to test some ideas
  30. and they turned out to be very interesting.
  31. What we did was to assess the feasibility
    of an application of the concept
  32. to the problem of image characterization
    and classification by logical depth.
  33. We performed a series of experiments
    of gradually increasing sophistication
  34. starting from fully controlled experiments
  35. and proceeding to the use of the
    best-known compression algorithms
  36. over a large data set.
  37. The first experiments consisted
    in controlling all the parameters involved.
  38. For example, adding random lines would at first
    increase the structure, but only weakly,
  39. and even before the image was half full
  40. of random black pixels and white pixels,
  41. it went back to low logical depth, because the image
    started looking random and, finally, simple.
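
A sketch of such a controlled perturbation experiment under the same assumptions as the previous snippet (Python, zlib as the stand-in compressor, synthetic data rather than the lecture's actual images): start from an all-white bitmap, progressively blacken randomly chosen pixels, and record the compressed length together with the decompression time at each stage.

    import random
    import time
    import zlib

    WIDTH, HEIGHT = 256, 256
    pixels = bytearray(b"\xff" * (WIDTH * HEIGHT))   # all-white image, one byte per pixel
    positions = list(range(len(pixels)))
    random.shuffle(positions)

    for fraction in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
        # blacken the first `fraction` of the shuffled pixel positions (cumulative)
        for p in positions[: int(fraction * len(positions))]:
            pixels[p] = 0
        compressed = zlib.compress(bytes(pixels), 9)
        start = time.process_time()
        zlib.decompress(compressed)
        depth_proxy = time.process_time() - start
        print(f"fraction={fraction:4.2f}  length={len(compressed):7d}  time={depth_proxy:.6f} s")

The compressed length should peak around the half-filled, random-looking stage and drop again once the image is all black; at this small scale the timing differences are tiny, which is why the measurement precautions discussed later in the lecture matter.
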
  42. The battery of tests involved a series of images
  43. devised to tune and verify different aspects of the methodology
  44. and a more realistic data set was also used
  45. to indicate whether the results were stable enough
  46. to yield the same values for replication purposes,
  47. whether they were consistent with the theory,
  48. and whether they were consonant with an intuitive sense
  49. of a complex or sophisticated object versus a simple one.
  50. Here we have a classification of images by compression length
  51. using a compression algorithm called pngcrush,
  52. which is a lossless compression tool that compresses PNG images.
  53. And the results are very stable
    when you use other compression algorithms
  54. such as bzip2 and Compress.
  55. The PNG format is in some ways ideal,
    because PNG is a lossless compression format,
  56. unlike other formats such as JPEG that store images
    by applying lossy compression.
  57. In other words, PNG images do not lose
    any information from the original image.
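
As a rough illustration of ranking by compression length, and not the lecture's actual pipeline, one can compress each image file with more than one general-purpose lossless compressor and sort by the smallest result; the file names below are hypothetical, and decoding the PNG files to raw pixel data first would be closer to what pngcrush works on.

    import bz2
    import zlib

    image_files = ["simple.png", "structured.png", "random.png"]   # hypothetical files

    def compressed_length(path: str) -> int:
        with open(path, "rb") as f:
            raw = f.read()
        # smallest of two lossless compressors, as a crude complexity proxy
        return min(len(zlib.compress(raw, 9)), len(bz2.compress(raw, 9)))

    for name in sorted(image_files, key=compressed_length):
        print(name, compressed_length(name))
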
  58. Here is the ranking by the compression times.
  59. The execution time was given by the Mathematica function Timing.
  60. The function Timing evaluates an expression
    and returns a list of the time taken in seconds,
  61. together with the result obtained.
  62. The function includes only CPU time spent
    in the Mathematica kernel.
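
A rough Python analogue of the behavior just described for Timing, using time.process_time so that only CPU time is counted and returning the elapsed time together with the result:

    import time
    import zlib

    def timed(f, *args):
        """Return (cpu_seconds, result) for f(*args), counting CPU time only."""
        start = time.process_time()
        result = f(*args)
        return time.process_time() - start, result

    seconds, decompressed = timed(zlib.decompress, zlib.compress(b"some data" * 10_000, 9))
    print(seconds, len(decompressed))
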
  63. The fact that several processes run concurrently
    in computing systems
  64. as part of their normal operation
  65. is one of the greatest challenges faced
  66. in attempting to measure with accuracy
    the time that the compression process takes,
  67. even when it is isolated from all other computer processes.
  68. This instability is due to the fact that
    the internal operation of the computer comes in several layers,
  69. mostly at the operating system level.
  70. So we killed all processes and ran the experiments
    after dummy warm-up computations,
  71. yielding rather stable results on the same computer.
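
A sketch of that stabilization procedure, under the same assumptions as the earlier snippets: a few dummy warm-up evaluations, then many repeated measurements on an otherwise quiet machine, keeping a robust statistic such as the minimum (or the median), which is much less sensitive to interference from other processes than a single run.

    import time
    import zlib

    def stable_decompression_time(compressed: bytes, warmups: int = 3, runs: int = 30) -> float:
        for _ in range(warmups):               # dummy warm-up computations
            zlib.decompress(compressed)
        samples = []
        for _ in range(runs):
            start = time.process_time()
            zlib.decompress(compressed)
            samples.append(time.process_time() - start)
        return min(samples)                    # the minimum is a common robust choice

    payload = zlib.compress(bytes(range(256)) * 4_000, 9)
    print(stable_decompression_time(payload))
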
  72. One can see in the last two figures the differences
    between the results produced
  73. by the compression lengths
    and by the decompression times.
  74. And they are in agreement with the theory,
  75. which assigns low logical depth and low decompression times
  76. to random and simple objects,
  77. but high logical depth to some objects in the middle.
  78. One can see the differences as they were theoretically expected.
  79. On the left we have the algorithmic complexity
    to logical depth mapping of each image,
  80. and on the right we have groups by logical depth
    based on the decompression times.
  81. So we can see how simple images remained shallow for logical depth,
  82. but random images that were incompressible,
  83. and thus ranked high according to algorithmic complexity,
  84. were ranked low for logical depth,
  85. thereby singling out as deep everything that has more structure,
  86. just as logical depth is meant to quantify.
  87. The groups that resonate most with our intuition
    of what is complex, in the sense of structure or sophistication,
  88. that is, objects that require computational time to be produced or generated,
  89. as opposed to objects that do not require such computation, such as simple or random objects,
  90. were found exactly in the middle, as predicted by logical depth.
  91. In the next lesson we will see how many of these tools and measures,
    in particular algorithmic probability,
  92. can make serious contributions
    to areas of machine learning and artificial intelligence.