Examples are provided using the affordances contain-ability, sit-ability, and support-ability.

Historically, neuroscience principles have heavily influenced artificial intelligence (AI); consider, for example, the influence of the perceptron model, essentially a simple model of a biological neuron, on artificial neural networks. More recently, notable AI advances, such as the growing popularity of reinforcement learning, often appear more aligned with cognitive neuroscience or psychology, focusing on function at a relatively abstract level. At the same time, neuroscience stands poised to enter a new era of large-scale, high-resolution data and appears more focused on underlying neural mechanisms or architectures that can, at times, seem rather removed from functional descriptions. While this might seem to foretell a new generation of AI approaches arising from a deeper exploration of neuroscience specifically for AI, the most direct path for achieving this is unclear. Here we discuss cultural differences between the two fields, including divergent priorities that should be considered when leveraging modern-day neuroscience for AI. For example, the two fields feed two very different applications that at times require potentially conflicting perspectives. We highlight small but significant cultural shifts that we feel would greatly facilitate increased synergy between the two fields.

In computational neuroscience, spiking neurons are often analyzed as computing devices that register bits of information, with each action potential carrying at most one bit of Shannon entropy. Here, I question this interpretation by using Landauer's principle to estimate an upper limit on the quantity of thermodynamic information that can be processed within a single action potential in a typical mammalian neuron. A straightforward calculation shows that an action potential in a typical mammalian cortical pyramidal cell can proce
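The kind of Landauer-bound estimate described above can be sketched numerically. The calculation below is illustrative only: the Landauer limit k_B T ln 2 at physiological temperature is a known physical quantity, but the per-spike energy figure (on the order of 1e8 ATP molecules at roughly 1e-19 J each) is an assumed order-of-magnitude value, not taken from the text.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 310.0           # physiological temperature, K

# Landauer's principle: minimum dissipation per bit erased is k_B * T * ln 2.
landauer_bound = k_B * T * math.log(2)  # roughly 3e-21 J per bit at 310 K

# Assumed metabolic cost of one cortical action potential:
# ~1e8 ATP molecules at ~1e-19 J of free energy each (order of magnitude only).
E_spike = 1e8 * 1e-19  # ~1e-11 J per spike (assumption, for illustration)

# Upper limit on bits that this energy budget could, in principle, process.
max_bits = E_spike / landauer_bound
print(f"Landauer bound at 310 K: {landauer_bound:.2e} J/bit")
print(f"Thermodynamic upper limit per spike: {max_bits:.2e} bits")
```

Under these assumed numbers the limit comes out many orders of magnitude above one bit per spike, which is the kind of gap the abstract's argument turns on; the precise figure depends entirely on the assumed energy budget.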