
# Information

Information: what is it, and how do we relate to it?

There is a problem with the Shannon definition at the atomic and axiomatic level: it refers to subjective manipulation of information. See http://philsci-archive.pitt.edu/10911/1/What_is_Shannon_Information.pdf

Our definition of information is related to both von Neumann entropy and Shannon information entropy. See https://web.mit.edu/6.933/www/Fall2001/Shannon2.pdf, https://en.wikipedia.org/wiki/Von_Neumann_entropy and https://en.wikipedia.org/wiki/Entropy_(information_theory)#Relationship_to_thermodynamic_entropy
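As a concrete anchor for the Shannon side of this: entropy measures the average surprise of a distribution, in bits. A minimal sketch (stdlib only; the biased-coin example is illustrative, not from the sources above):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits.

    Zero-probability outcomes contribute nothing (0 * log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly one bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Von Neumann entropy generalizes the same formula to quantum states: it is the Shannon entropy of the eigenvalues of a density matrix, which is why the two notions coincide for classical (diagonal) distributions.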

Through this, information can be measured in the brain via its pathways. This means memetic inoculation can be inbuilt and trained.

Use of Bayesian and time-based updating leads to algorithmic information, which relates information to a probability space of informational events (https://en.wikipedia.org/wiki/Algorithmic_information_theory, https://en.wikipedia.org/wiki/Algorithmic_probability). A good primer on the post-Bayes algorithmic theory is http://www.talkorigins.org/faqs/information/algorithmic.html, and https://arxiv.org/pdf/cs/0703024.pdf covers algorithmic information theory itself, noting that it includes notions of randomness that were not found in Shannon's work. Fisher information (https://en.wikipedia.org/wiki/Fisher_information) still needs to be digested here; it would have to be considered in terms of the paradigms of prior and posterior. The other wiki article on this topic (https://en.wikipedia.org/wiki/Information) gives a more atomic definition, which is less like Shannon's.
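The algorithmic notion of information (Kolmogorov complexity) is uncomputable, but a standard practical proxy is compressed length: a string with a short description compresses well, a random string does not. A minimal sketch of that idea, assuming zlib as the (arbitrary) reference compressor:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Rough upper-bound proxy for algorithmic information content:
    the length of a zlib-compressed encoding of the data."""
    return len(zlib.compress(data, 9))

# Highly regular data has a short "program" (description), so it compresses well.
regular = b"ab" * 500

# Pseudo-random bytes have no exploitable structure, so they barely compress.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))

print(compressed_size(regular), compressed_size(noisy))
```

This is exactly the distinction the arXiv paper points at: algorithmic randomness (incompressibility of an individual object) is a property Shannon's ensemble-level entropy does not capture.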

The brain can be modelled as constantly predicting states via Kalman filters. ^{[1]}

For Kalman filters applied to brain–computer interfaces, see: https://www.researchgate.net/figure/Black-color-represents-basic-Kalman-filter-graphical-model-blue-arrow-represents-an_fig2_262674561
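The predict/update loop that makes the Kalman filter a model of constant prediction can be shown in one dimension. A minimal sketch (the noise variances and the constant-signal example are illustrative assumptions, not taken from the linked figure):

```python
def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle of a 1-D Kalman filter.

    x, p : prior state estimate and its variance
    z    : new noisy observation
    q, r : process and measurement noise variances (illustrative values)"""
    # Predict: the state is assumed constant, but uncertainty grows by q.
    p = p + q
    # Update: the Kalman gain k weighs prediction against observation.
    k = p / (p + r)
    x = x + k * (z - x)   # correct the estimate toward the observation
    p = (1 - k) * p       # the correction shrinks the uncertainty
    return x, p

# Track a constant signal (true value 5.0) from noisy readings.
readings = [5.3, 4.6, 5.1, 4.9, 5.2, 4.8]
x, p = 0.0, 1.0
for z in readings:
    x, p = kalman_step(x, p, z)
print(x, p)
```

Each cycle is literally a prediction followed by an error-driven correction, which is why the filter is a natural formalism for predictive-processing accounts of the brain.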

The use of Bayesian networks ^{[2]} for neural nets also shows a form of information abstraction as a method of informational use.

A good primer on Bayesian networks: https://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html
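The core operation of a Bayesian network is inference by enumeration: multiply the conditional probability tables along the graph, then normalize. A minimal sketch with a hypothetical two-node network (Rain → WetGrass; the probabilities are invented for illustration):

```python
# Conditional probability tables for a two-node network: Rain -> WetGrass.
p_rain = 0.2                                   # prior P(Rain)
p_wet_given_rain = {True: 0.9, False: 0.1}     # P(WetGrass | Rain)

def posterior_rain_given_wet():
    """Bayes' rule: P(Rain | Wet) = P(Wet | Rain) P(Rain) / P(Wet)."""
    joint_rain = p_wet_given_rain[True] * p_rain            # P(Wet, Rain)
    joint_no_rain = p_wet_given_rain[False] * (1 - p_rain)  # P(Wet, ~Rain)
    return joint_rain / (joint_rain + joint_no_rain)        # normalize

print(round(posterior_rain_given_wet(), 3))  # 0.692
```

Observing wet grass raises the belief in rain from the 0.2 prior to about 0.69; this prior-to-posterior updating is the same Bayesian machinery discussed above for algorithmic information.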

The overlap of computation and abstraction still needs to be covered: https://en.wikipedia.org/wiki/Theory_of_computation and https://en.wikipedia.org/wiki/Computational_theory_of_mind. See also the references on Fodor and Dennett.

Hence we use the notion of entropy rather than information itself: "information" carries connotations that cause definitional problems for memetics, and here information is treated as a subset of entropy. Information is also objective, but the abstraction of that information is not.