RFP: Tokenomics 2.0 Proposal "Adaptive Capital"

Tokenomics 2.0 Implementation Proposal: “Adaptive Capital”

Background

We are proposing an ecosystem-level partnership between Regen Network and Protocol Labs through the mutual funding of a Tokenomics DAO. This comes from our team’s desire to bridge the gap between research and implementation.

The proposal below outlines the current state of our implementation plan, and we are soliciting discussion and feedback to advance it further. There is plenty of scope for improvement, which we trust will happen as the work progresses. We believe the proposal submitted to PL can lay strong research foundations, while Regen is fertile soil for technical implementation that contributes to grounded impact in real communities.

Specifications

Regen Registry defines Nature-Based Solutions as follows:

“Actions to protect, regeneratively manage and restore natural or modified ecosystems that address societal challenges effectively and adaptively, simultaneously providing human well-being and biodiversity benefits.”

“Adaptive Capital” is, in turn, a proposition for valuing living ecosystems through multi-perspective architecture and the free energy principle: a means of evaluating quantitative measures alongside qualitative outcomes. For such a project to find traction, we need to consider the existential domains of markets, science and technology, while holding in mind the essential values of people, planet and protocols. Let’s explore our assumptions about these domains:
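Before we do, a brief aside on the free energy principle, since the term carries a precise meaning in the active-inference literature. What we have in mind is, roughly, variational free energy; the following is a sketch for orientation, not a formal commitment of this proposal:

$$
F = \mathbb{E}_{q(s)}\big[\ln q(s) - \ln p(o, s)\big] = D_{\mathrm{KL}}\big[q(s) \,\|\, p(s \mid o)\big] - \ln p(o)
$$

Here $o$ stands for observations (the quantitative measurements), $s$ for hidden ecological states, and $q(s)$ for a model’s approximate posterior over those states. Minimising $F$ trades data fit against model complexity, which is one way of relating quantitative measures to qualitative model structure.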

Market Assumptions
Markets are seeking pathways for divestment from degenerate commodities into more regenerative investment vehicles. However, divestment is constrained by a lack of such pathways, and it is estimated that there are two trillion dollars of stranded assets trading on Wall Street books. In the author’s opinion, this number represents a transformation opportunity that is global in scale, provided …

Protocol Labs RFP-X Research Proposal: Semantic Impact Markets for Practical Language Evolution

LunarPunk_Labs - University of the Third Horizon

  1. Research question: How do we increase relevance on the internet so as to aid the realisation of knowledge into understanding, while effectively bridging the gap from research to implementation?
  2. Research problem statement: To keep up with the world’s rising complexity, we will need to deepen our human capacities for representation and interpretation, augmenting our intellect through breakthroughs in computing.
  3. Proposed solution: We are proposing to act as coordinators in what might amount to an ecosystem-level partnership between Protocol Labs and Regen Ledger to bootstrap semantically composable Impact Markets and ‘Adaptive Capital’ via a Tokenomics csDAO on Regen. The mechanism we are suggesting is a content delivery network that systematically classifies parametric groupings as Friston blankets to enable interoperable eco-credits represented as generalised ‘hypercerts’ (a rough data sketch follows this list). Framing model metadata as functional geometries to group such parameters provides epistemological transparency: a necessity for both governance and scientific research.
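To make the data shape in item 3 concrete, here is a minimal sketch in TypeScript. Everything in it (interface names, fields, units, the placeholder IDs) is our illustrative assumption for discussion, not an existing Regen Ledger, hypercerts, or IPFS schema:

```typescript
// Illustrative sketch only: all names and fields are assumptions for
// discussion, not an existing Regen Ledger, hypercerts, or IPFS schema.

// A "blanket" in the Markov/Friston sense: the parameters that mediate
// between an ecosystem's internal states and what a market can observe.
interface ParameterBlanket {
  // Sensory parameters: measurements flowing in from the ecosystem.
  sensory: Record<string, number>;
  // Active parameters: interventions flowing out toward the ecosystem.
  active: Record<string, number>;
}

// A generalised eco-credit "hypercert": a claim scoped to a region and a
// time window, carrying the model metadata that makes it auditable.
interface EcoHypercert {
  id: string;                              // content address of the claim
  regionRef: string;                       // reference to a boundary document
  timeframe: { from: string; to: string }; // ISO-8601 dates
  methodologyRef: string;                  // URI of the credit methodology
  blanket: ParameterBlanket;               // the evidence grouping itself
}

// A filled-in example with placeholder addresses.
const example: EcoHypercert = {
  id: "ipfs://<cid>",
  regionRef: "ipfs://<cid-of-boundary>",
  timeframe: { from: "2023-01-01", to: "2023-12-31" },
  methodologyRef: "https://example.org/methodology/grasslands-v1",
  blanket: {
    sensory: { ndvi: 0.71, soilCarbonTPerHa: 41.2 },
    active: { hectaresReplanted: 120 },
  },
};
```

The point the sketch tries to make: because each credit carries its parameter blanket explicitly, credits minted under different methodologies stay comparable and auditable at the level of shared parameters, which is what we mean by epistemological transparency.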

Problem Framing as Macroeconomic Hypothesis

Before innovating on such financial technology, let’s articulate some working assumptions about the market in order to progressively approximate its enabling constraints. Let’s start with the major market participants and then examine the forces that move them.

Participants (Who)

  • Alpha investors; risk-on. Seek return on investment. Quantifiers.
  • Beta investors; risk-off. Seek wealth preservation. Qualifiers.

I want to clarify: is this a submission to the Regen tokenomics 2.0 RFP?

Is the author a representative of Protocol Labs?

How about a tl;dr?

Enjoyed reading and following the links. Interesting concepts. Thank you for posting/sharing.

I was surprised to see that the milestones are projected out to 2026, and I’m wondering what Regen’s timeline/urgency is for seeing tokenomics solutions implemented. What are some of the deliverables that could come sooner?

Also, I think it is important to consider, with any product/solution, who will be using it, what the UX is, and whether a significant number of users will participate in/use the products/solutions produced.

What are the pain points and how is this solution the best pain killer?

The pain seems to be, for those who are interested in tracking ecological states, sourcing reliable data about the state and the actions performed on it, and incentivizing positive actions through token markets.

So the solution is to create a 3- or 4-point graph system that aggregates various data collections into a systematic, composable, semantic language that can be used to describe ecological states and the actions within them. The pain killer is that we would be able to easily converse about, assess, compute, and trust the states of ecological systems using these digital representations, and enable funds to flow to where the most optimal outcomes are being produced. Yes? More or less?
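To check my understanding, something like the following is what I picture. To be clear, this is entirely made up on my end; the node kinds, relation names, and IDs are illustrative, not anything from the proposal or Regen’s stack:

```typescript
// A guess at the "3 or 4 point graph" idea: ecological states, the actions
// taken on them, and the observations backing both, in one traversable graph.
type NodeKind = "EcologicalState" | "Action" | "Observation" | "Outcome";

interface GraphNode {
  id: string;                             // e.g. a content address
  kind: NodeKind;
  data: Record<string, number | string>;  // metric values or descriptors
}

interface GraphEdge {
  from: string;                                  // node id
  to: string;                                    // node id
  relation: "observes" | "actsOn" | "produces";  // made-up relation vocabulary
}

const graph: { nodes: GraphNode[]; edges: GraphEdge[] } = {
  nodes: [
    { id: "state-1", kind: "EcologicalState", data: { canopyCover: 0.63 } },
    { id: "action-1", kind: "Action", data: { practice: "rotational-grazing" } },
    { id: "obs-1", kind: "Observation", data: { source: "sentinel-2" } },
  ],
  edges: [
    { from: "action-1", to: "state-1", relation: "actsOn" },
    { from: "obs-1", to: "state-1", relation: "observes" },
  ],
};
```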

How is this different from and/or similar to using data-structure standards such as JSON-LD, or would it be part of the stack?

How is this different from and/or similar to ixo’s approach of creating a ‘digital twin’ of ecological state and measuring impact actions?

Of course, one of the major differences is that your proposed solution is to build this tokenomics/eco-state system into Regen’s EcoCredits ecosystem / tech stack.

Again, I’m interested in hearing more about the user journeys. Who are the actors, and what is the story for each of them as they interact with the many facets of this Adaptive Capital solution?

Thanks!

Josh, I’m sure the above is really good, but I couldn’t really follow along.

Perhaps I’m simple-minded, but I come into this tokenomics discussion with a simple and clear outcome in mind: an increase in the value of the REGEN token, and, if possible, a token value increase that is directly correlated with on-chain actions relating to regeneration and positive ecological impact.

I am worried this thread has taken the tokenomics discussion and buried it under a huge pile of complexity (regardless of how impressive and well thought out it might be). Perhaps we continue with this thread, but I will also start another that aims to gather the community’s ideas for swift and simple tokenomics tweaks that could turn around the general trend of the REGEN token toward nil value.