EVE Frontier Builder Complete Course
Approximately 2 hours per lesson, 54 lessons in total (36 foundational chapters + 18 practical examples), for 108 hours of complete learning content.
📖 Chapter Roadmap (Recommended Learning Order, ~2 hours per lesson)
Prerequisite
| Chapter | File | Summary |
|---|---|---|
| Prelude | chapter-00.md | Understand EVE Frontier first: What players compete for, why structures matter, how location/combat/logistics/economy form complete gameplay |
Phase 1: Getting Started (Chapter 1-5)
| Chapter | File | Summary |
|---|---|---|
| Chapter 1 | chapter-01.md | EVE Frontier architecture overview: Three-layer model, smart component types, Sui/Move selection |
| Chapter 2 | chapter-02.md | Development environment setup: Sui CLI, EVE Vault, test asset acquisition and minimal validation |
| Chapter 3 | chapter-03.md | Move contract basics: Modules, Abilities, object ownership, Capability/Witness/Hot Potato |
| Chapter 4 | chapter-04.md | Smart component development and on-chain deployment: Characters, network nodes, complete turret/stargate/storage unit modification workflow |
| Chapter 5 | chapter-05.md | dApp frontend development: dapp-kit SDK, React Hooks, wallet integration, on-chain transactions |
Companion examples: Example 1 Turret whitelist, Example 2 Stargate toll booth
Phase 2: Builder Engineering Loop (Chapter 6-10)
| Chapter | File | Summary |
|---|---|---|
| Chapter 6 | chapter-06.md | Builder Scaffold entry point: Project structure, smart_gate architecture, compilation and publishing |
| Chapter 7 | chapter-07.md | TS scripts and frontend: helper.ts, script workflows, React dApp templates |
| Chapter 8 | chapter-08.md | Server-side coordination: Sponsored Tx, AdminACL, on-chain and off-chain collaboration |
| Chapter 9 | chapter-09.md | Data retrieval: GraphQL, event subscriptions, indexer approaches |
| Chapter 10 | chapter-10.md | dApp wallet integration: useConnection, sponsored transactions, Epoch handling |
Companion examples: Example 4 Quest unlocking system, Example 11 Item rental system
Phase 3: Advanced Contract Design (Chapter 11-17)
| Chapter | File | Summary |
|---|---|---|
| Chapter 11 | chapter-11.md | Deep dive into ownership models: OwnerCap, Keychain, Borrow-Use-Return, delegation |
| Chapter 12 | chapter-12.md | Move advanced: Generics, dynamic fields, event systems, Table and VecMap |
| Chapter 13 | chapter-13.md | NFT design and metadata management: Display standard, dynamic NFTs, Collection patterns |
| Chapter 14 | chapter-14.md | On-chain economic system design: Token issuance, decentralized markets, dynamic pricing, vaults |
| Chapter 15 | chapter-15.md | Cross-contract composability: Calling other Builders’ contracts, interface design, protocol standards |
| Chapter 16 | chapter-16.md | Location and proximity systems: Hashed locations, proximity proofs, geographic strategy design |
| Chapter 17 | chapter-17.md | Testing, debugging, and security audits: Move unit tests, vulnerability types, upgrade strategies |
Companion examples: Example 3 On-chain auction house, Example 6 Dynamic NFTs, Example 7 Stargate logistics network, Example 9 Cross-Builder protocol, Example 13 Subscription-based pass, Example 14 NFT staking and lending, Example 16 NFT crafting and decomposition, Example 18 Cross-alliance diplomatic treaties
Phase 4: Architecture, Integration, and Products (Chapter 18-25)
| Chapter | File | Summary |
|---|---|---|
| Chapter 18 | chapter-18.md | Multi-tenancy and game server integration: Tenant model, ObjectRegistry, server-side scripts |
| Chapter 19 | chapter-19.md | Full-stack dApp architecture design: State management, real-time updates, multi-chain support, CI/CD |
| Chapter 20 | chapter-20.md | In-game integration: Overlay UI, postMessage, game event bridging |
| Chapter 21 | chapter-21.md | Performance optimization and Gas minimization: Transaction batching, read-write separation, off-chain computation |
| Chapter 22 | chapter-22.md | Move advanced patterns: Upgrade-compatible design, dynamic field extensions, data migration |
| Chapter 23 | chapter-23.md | Publishing, maintenance, and community collaboration: Mainnet deployment, Package upgrades, Builder collaboration |
| Chapter 24 | chapter-24.md | Troubleshooting handbook: Common Move/Sui/dApp error types and systematic debugging methods |
| Chapter 25 | chapter-25.md | From Builder to product: Business models, user growth, community operations, progressive decentralization |
Companion examples: Example 5 Alliance DAO, Example 12 Alliance recruitment, Example 15 PvP item insurance, Example 17 In-game overlay implementation
🔬 Phase 5: World Contract Source Code Deep Dive (Chapter 26-32)
Based on real source code from world-contracts, deep analysis of EVE Frontier core system mechanisms.
| Chapter | File | Summary |
|---|---|---|
| Chapter 26 | chapter-26.md | Complete access control analysis: GovernorCap / AdminACL / OwnerCap / Receiving pattern |
| Chapter 27 | chapter-27.md | Off-chain signing × on-chain verification: Ed25519, PersonalMessage intent, sig_verify deep dive |
| Chapter 28 | chapter-28.md | Location proof protocol: LocationProof, BCS deserialization, proximity verification implementation |
| Chapter 29 | chapter-29.md | Energy and fuel systems: EnergySource, Fuel consumption rate calculation, known bug analysis |
| Chapter 30 | chapter-30.md | Extension pattern implementation: Official tribe_permit + corpse_gate_bounty deep dive |
| Chapter 31 | chapter-31.md | Turret AI extensions: TargetCandidate, priority queue, custom AI development |
| Chapter 32 | chapter-32.md | KillMail system: PvP kill records, TenantItemId, derived_object anti-replay |
Companion examples: Example 8 Builder competition system, Example 10 Comprehensive implementation
🔐 Phase 6: Wallet Internals and Future (Chapter 33-35)
After mastering wallet integration and dApps, dive deeper into wallet internals and future directions for a smoother learning curve.
| Chapter | File | Summary |
|---|---|---|
| Chapter 33 | chapter-33.md | zkLogin principles and design: Zero-knowledge proofs, FusionAuth OAuth, Enoki salt, ephemeral key pairs |
| Chapter 34 | chapter-34.md | Technical architecture and deployment: Chrome MV3 five-layer structure, Keeper security container, messaging protocol, local build |
| Chapter 35 | chapter-35.md | Future outlook: Zero-knowledge proofs, fully decentralized games, EVM interoperability |
Recommended companion: After completing this phase, review Example 17’s wallet connection, signing, and in-game integration workflow.
🛠 Example Index (By Complexity, 2 hours each)
The main roadmap above distributes the examples across the phases; the index below orders them by complexity for topic selection and review.
Beginner Examples (Example 1-3) — Basic Component Applications
| Example | File | Technical Highlights |
|---|---|---|
| Example 1 | example-01.md | Turret whitelist: MiningPass NFT + AdminCap + admin dApp |
| Example 2 | example-02.md | Stargate toll booth: Vault contract + JumpPermit + player ticket dApp |
| Example 3 | example-03.md | On-chain auction house: Dutch auction pricing + auto-settlement + real-time countdown dApp |
Intermediate Examples (Example 4-7) — Economy and Governance
| Example | File | Technical Highlights |
|---|---|---|
| Example 4 | example-04.md | Quest unlocking system: On-chain bit flags + off-chain monitoring + conditional stargate |
| Example 5 | example-05.md | Alliance DAO: Custom Coin + snapshot dividends + weighted governance voting |
| Example 6 | example-06.md | Dynamic NFTs: Evolvable equipment with real-time metadata updates based on game state |
| Example 7 | example-07.md | Stargate logistics network: Multi-hop routing + Dijkstra pathfinding + dApp |
Advanced Examples (Example 8-10) — System Integration
| Example | File | Technical Highlights |
|---|---|---|
| Example 8 | example-08.md | Builder competition system: On-chain leaderboard + points + automatic trophy NFT distribution |
| Example 9 | example-09.md | Cross-Builder protocol: Adapter pattern + multi-contract aggregated marketplace |
| Example 10 | example-10.md | Comprehensive implementation: Space resource war (integrating characters/turrets/stargates/tokens) |
Extended Examples (Example 11-15) — Finance and Productization
| Example | File | Technical Highlights |
|---|---|---|
| Example 11 | example-11.md | Item rental system: Time-locked NFTs + deposit management + early return refund |
| Example 12 | example-12.md | Alliance recruitment: Application deposit + member voting + veto power + auto NFT issuance |
| Example 13 | example-13.md | Subscription-based pass: Monthly/quarterly packages + transferable Pass NFT + renewal |
| Example 14 | example-14.md | NFT staking and lending: 60% LTV + 3% monthly interest + overdue liquidation auction |
| Example 15 | example-15.md | PvP item insurance: Purchase policy + server-signed claims + payout pool |
Advanced Extended Examples (Example 16-18) — Innovative Gameplay
| Example | File | Technical Highlights |
|---|---|---|
| Example 16 | example-16.md | NFT crafting and decomposition: Three-tier item system + on-chain randomness + consolation prize mechanism |
| Example 17 | example-17.md | In-game overlay implementation: In-game toll booth + postMessage + seamless signing |
| Example 18 | example-18.md | Cross-alliance diplomatic treaties: Dual-signature activation + deposit constraint + breach evidence and penalties |
📖 Reading Recommendations
| Phase | Content | Recommendation | Duration |
|---|---|---|---|
| Getting Started | Prelude → Chapter 1-5 → Example 1, 2 | Build gameplay intuition first, then architecture, components, and minimal loop | ~16h |
| Engineering Loop | Chapter 6-10 → Example 4, 11 | Complete the Builder end-to-end workflow first | ~14h |
| Advanced Contracts | Chapter 11-17 → Example 3, 6, 7, 9, 13, 14, 16, 18 | Return to strengthen contract design skills | ~30h |
| Architecture & Products | Chapter 18-25 → Example 5, 12, 15, 17 | Focus on long-term maintenance, game integration, and productization | ~24h |
| Source Code Deep Dive | Chapter 26-32 → Example 8, 10 | Reverse-engineer design philosophy from World core modules, then build complex systems | ~18h |
| Wallet Internals & Future | Chapter 33-35 | Deep understanding of EVE Vault internals and future directions | ~6h |
Recommended Learning Paths
Quick Builder Start (Shortest Path, ~26h): Prelude → Chapter 1-4 → Example 1-2 → Chapter 6-10 → Example 4
Complete Builder Path (~108h): Prelude → Chapter 1-5 → Example 1-2 → Chapter 6-10 → Example 4, 11 → Chapter 11-17 → Example 3, 6, 7, 9, 13, 14, 16, 18 → Chapter 18-25 → Example 5, 12, 15, 17 → Chapter 26-32 → Example 8, 10 → Chapter 33-35
Source Code Researcher Path (~36h): Prelude → Chapter 3 → Chapter 11 → Chapter 15 → Chapter 26-32 → Example 8, 10 → Chapter 6-10
📚 Reference Resources
- Official builder-documentation
- builder-scaffold
- World Contracts Source Code
- Sui Documentation
- Move Book
- EVE Frontier dapp-kit API
- Sui GraphQL IDE (Testnet)
- EVE Frontier Discord
- Glossary
Glossary
This page provides unified explanations for frequently occurring terms that appear across multiple chapters. When reading Chapters 26-35 and Examples 11-18, use this page as a quick reference guide.
AdminACL
An authorization control object for server-side access in World contracts. The game server or Builder backend writes approved sponsor addresses into AdminACL, and on-chain logic verifies whether the caller has “server representative” identity through validation functions like verify_sponsor.
OwnerCap
An ownership credential for objects or structures. Many World-side permission checks don’t just look at ctx.sender(), but require the caller to explicitly hold an OwnerCap associated with the target object.
AdminCap
An admin capability object within a Builder’s own module. It’s typically issued to the publisher during init and used to configure settings, modify rules, pause functionality, or withdraw funds.
Typed Witness
A pattern that uses the type system to tighten authorization boundaries. EVE Frontier’s Gate / Turret / Storage Unit extensions often use it to restrict “only specific modules and specific entry points” from calling sensitive APIs.
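The three capability objects above share a common shape in Move. Below is a minimal sketch of a hypothetical `builder::toll_gate` module illustrating an `AdminCap` minted in `init` plus a typed witness; all names are illustrative and do not come from world-contracts (Move 2024 edition, with Sui's default implicit imports):

```move
// Hypothetical module sketching the AdminCap and typed-witness patterns.
// All names here are illustrative, not taken from world-contracts.
module builder::toll_gate {
    /// Gate settings, shared so anyone can read the current toll.
    public struct Gate has key { id: UID, toll: u64 }

    /// Admin capability: minted once in `init` and sent to the publisher.
    /// Holding this object *is* the authorization to change settings.
    public struct AdminCap has key, store { id: UID }

    /// Typed witness: only this module can construct a value of this type,
    /// so an API that takes a `GateWitness` is reachable only through the
    /// entry points this module deliberately exposes.
    public struct GateWitness has drop {}

    fun init(ctx: &mut TxContext) {
        transfer::transfer(AdminCap { id: object::new(ctx) }, ctx.sender());
        transfer::share_object(Gate { id: object::new(ctx), toll: 0 });
    }

    /// Only a holder of the AdminCap may change the toll.
    public fun set_toll(_cap: &AdminCap, gate: &mut Gate, new_toll: u64) {
        gate.toll = new_toll;
    }

    /// Witness-gated API: callable only by code paths that can obtain a
    /// `GateWitness`, i.e. paths this module chooses to expose.
    public fun force_open(_w: GateWitness, gate: &mut Gate) {
        gate.toll = 0;
    }
}
```

The design point: `AdminCap` authorizes *whoever holds the object*, while `GateWitness` authorizes *whichever code path can construct the type*; World extensions combine both ideas.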
Shared Object
A shared object on Sui that can be concurrently accessed by multiple parties. World structures like Gates, Storage Units, and Registries often adopt this model.
Derived Object
An object ID deterministically derived from a parent object and business key. Scenarios like KillMail and registry sub-objects use this to ensure the business ID -> on-chain object ID mapping is stable and non-repeatable.
Sponsored Transaction
A transaction initiated by a player but with Gas paid by the Builder or server. EVE Vault supports sponsored transaction extensions, which is the core foundation for “users can use dApps without SUI”.
zkLogin
Sui’s passwordless login solution. After users complete OAuth login with their Web2 identity, the wallet derives the on-chain address based on ephemeral keys, salt, and proof.
Epoch
Sui’s epoch unit. zkLogin’s temporary proofs and certain caches are bound to Epochs, requiring reissuance or login state refresh after expiration.
0x6
The fixed object ID for Sui’s Clock system object. Many time-related examples in this course pass 0x6 as a parameter.
0x8
The fixed object ID for Sui’s Random system object. Examples requiring on-chain randomness typically pass this object.
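Both system objects are passed into contracts as ordinary arguments; the client supplies `0x6` and `0x8` as object IDs when building the transaction. A minimal sketch assuming Sui's `sui::clock` and `sui::random` framework modules (the `builder::lootbox` module and `open` function are hypothetical):

```move
// Illustrative sketch: reading the Clock (0x6) and Random (0x8) system
// objects inside an entry function. Module/function names are hypothetical.
module builder::lootbox {
    use sui::clock::Clock;
    use sui::random::{Self, Random};

    /// `clock` is the shared object at 0x6 and `rnd` the one at 0x8;
    /// randomness-consuming functions must be `entry` and take `&Random`.
    entry fun open(clock: &Clock, rnd: &Random, ctx: &mut TxContext) {
        let now_ms = clock.timestamp_ms();                      // current time (ms)
        let mut gen = random::new_generator(rnd, ctx);          // per-tx generator
        let roll = random::generate_u8_in_range(&mut gen, 1, 100);
        // ...decide the reward from `now_ms` and `roll`...
        let _ = now_ms;
        let _ = roll;
    }
}
```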
LUX and SUI
Many examples in this course “use SUI instead of LUX for demonstration” to facilitate explanation in public environments and with standard SDKs. When actually integrating with EVE Frontier, use the in-game real assets and World/wallet interfaces as the standard.
GraphQL / Indexer
GraphQL mentioned in this book mostly refers to the query endpoints provided by Sui’s indexing layer; Indexer refers to off-chain retrieval services built around events and object state. They are primarily responsible for “reads”, not “writes”.
Prelude: Understanding EVE Frontier as a Game First
Objective: Before diving into contracts, components, and dApps, first understand what players in EVE Frontier are competing for, building, and why these mechanics are naturally suited to become on-chain rules.
Status: Introductory chapter. Focus is on establishing “gameplay intuition” first, so that Gate, Turret, StorageUnit, KillMail, and LocationProof won’t feel like abstract nouns in later chapters.
0.1 This Is Not a “Wallet Game with Blockchain Attached”
If you start by thinking of EVE Frontier as “a space game with NFTs and tokens,” you’ll likely find things increasingly awkward as you learn more, because the true core of this game isn’t issuing assets: it’s a persistent, resource-competitive, geographically constrained, player-conflict-driven open world.
It’s closer to this combination:
| Dimension | What EVE Frontier is More Like | Why This Matters |
|---|---|---|
| World Structure | A persistent space sandbox | The world doesn’t pause when you log off; facilities, routes, control zones, and economic relationships continue changing |
| Survival Pressure | Starts from “staying alive”, not “daily check-in rewards” | Resources, fuel, transport, security, and location are real problems |
| Player Relationships | Long-term cooperation and conflict coexist | You’ll need alliances, supply chains, corridors, defense, diplomacy, and retaliation |
| Building Significance | Buildings are infrastructure that changes gameplay, not decoration | Stargates, turrets, storage facilities directly affect who can pass, who can access cargo, and who gets attacked |
| Blockchain’s Role | Public rules layer and asset layer | The focus isn’t moving all gameplay on-chain, but making the parts worth public verification into programmable rules |
So you can start by remembering this:
The “unit of fun” in EVE Frontier isn’t a single NFT, but the long-term strategic interplay between player groups around infrastructure, resource flows, and territorial control.
0.2 What Does a Typical Player Actually Do in This World?
From a gameplay perspective, player activities usually revolve around this main loop:
Enter the world
-> Establish character and identity
-> Find a safe foothold
-> Acquire resources, items, and fuel
-> Build your own base or connect to others' facilities
-> Transport, trade, charge fees, defend, or raid
-> Lose, recover, rebuild, and upgrade through conflicts
This isn’t a linear quest chain, but a repeating cycle of operation and conflict. Players may focus on different playstyles:
- Survival-focused players: Prioritize supply, safety, and sustainable presence
- Industrial players: Focus on storage, logistics, item circulation, and markets
- Military players: Care about turrets, defensive lines, friend-or-foe identification, KillMail, and combat losses
- Operator players: Focus on toll gates, permission systems, alliance collaboration, and service pricing
- Builder/Operator players: Focus on turning infrastructure into fee-charging, filtering, incentivizing, and auto-executing rule systems
This book targets the last type, but what you design ultimately serves the first four types, so you must first understand what problems they actually encounter in the game.
From a Player’s Perspective, What Does a Typical Day Look Like?
If we compress gameplay into a more concrete daily loop, many actions happen like this:
Log in
-> Check current location and base status
-> Verify fuel, inventory, access permissions, nearby risks
-> Decide whether to mine, transport, trade, defend, or attack today
-> Use Gate, Storage, market, or alliance facilities to achieve goals
-> May encounter interception, tolls, turret checks, or PvP en route
-> Successfully bring back gains, or regroup resources after losses
In this daily loop, almost every step can be influenced by Builders:
- “Can I safely go out” encounters Gate
- “Check inventory and supplies” encounters Storage Unit
- “Is my base still operational” encounters Network Node / Energy / Fuel
- “Will I get intercepted on the route” encounters Turret and regional governance
- “Can losses from losing fights be tracked” encounters KillMail
- “Is someone actually physically present” encounters LocationProof
In other words, Builders aren’t just making extra web pages—they’re integrating into decision points players encounter daily.
What Do Players Really Weigh Repeatedly in This World?
From a gameplay perspective, many choices in EVE Frontier ultimately come down to these four trade-offs:
| What Players Weigh | Typical Questions |
|---|---|
| Profit vs Risk | This route earns more, but is it more vulnerable to attack? |
| Convenience vs Control | Letting everyone use my facility earns more, but will I lose filtering power? |
| Liquidity vs Security | Goods in public nodes are easier to sell, but more prone to issues? |
| Short-term Gain vs Long-term Order | Charging high tolls today feels great, but will it drive everyone away and cause route decline? |
This is also why many rules you’ll see later don’t look like “button features” but more like institutional design. Fees, permissions, whitelists, insurance, deposits, payouts, and rewards are essentially tuning these contradictions.
The Same World Looks Completely Different to Different Players
If you want to build truly useful Builder products, you must realize: the same gate, turret, or warehouse has completely different value to different people.
| Perspective | What They See First | What They Really Care About |
|---|---|---|
| New players | Is it safe, will I get lost, will I die immediately upon leaving | Surviving, avoiding mistakes, not getting scammed, knowing what to do next |
| Merchants / Logistics players | Route stability, warehouse usability, predictable fees | Cost, timeliness, wastage, inventory safety |
| Pirates / Raider players | Which routes have people, cargo, and vulnerabilities | Interception profit, ambush efficiency, target screening, escape cost |
| Alliance operators | Which nodes must be defended, which routes must be open, which facilities are worth long-term investment | Regional order, taxation, logistics resilience, defensive zone stability |
| Builder / Operators | Which nodes can become rule entry points, toll points, data entry points | Rule enforceability, operating costs, user conversion, long-term reusability |
This table is important because it explains why the same facility generates different demands:
- Newcomers want gates with “fewer restrictions”
- Operators want gates with “stronger screening and governance capabilities”
- Merchants want gates with “transparent pricing, stable passage”
- Pirates want gates that “create congestion and exposure”
A Builder’s job isn’t to satisfy just one party, but to consciously decide which side your product stands with.
Viewing the Same Base Through Five Typical Identities
Suppose there’s a base built on a route node with Network Node + Gate + Turret + Storage Unit. Different people think completely differently when entering:
1. Newcomer
When a newcomer enters this base, their first reaction usually isn’t “how elegant this rule system is,” but:
- Will I get attacked
- Do I need to pay to pass through the gate
- Will my stuff be lost if I store it here
- Can I understand what this system is asking me to do
For newcomers, a good Builder system often has these characteristics:
- Clear rules
- Low failure cost
- Few erroneous operations
- Clear explanations of “why you were rejected”
So many things you think are “UX copy” are actually part of gameplay retention.
2. Merchant / Logistics Player
Merchants won’t first look at whether this base is “cool”—they calculate:
- How much time does passing through this Gate save versus detouring
- Are tolls stable
- Can the Storage Unit safely store cargo temporarily
- Can turrets ensure basic safety for high-value cargo transport
If a base makes merchants form the expectation that “although expensive, it’s stable and reliable,” it might gradually become a trading node. For merchants, predictability itself is product value.
3. Pirate / Raider
Pirates see not services, but vulnerabilities and traffic:
- Is this route a must-pass
- Does the gate entrance create queues and slowdowns
- Which players will linger because of paying, opening boxes, or trading
- Can turrets be circumvented, baited, or exploited
This perspective forces Builders to rethink security issues. Many systems aren’t just “as long as the function runs”—you must ask:
- Will it create fixed ambush points
- Will it expose high-value users
- Will it allow certain playstyles to become overly stable farming machines
4. Alliance Operator
Alliance operators look at sustained order:
- Is this base worth long-term defense
- Are power and maintenance costs manageable
- Can gate access rules distinguish allies, visitors, and hostiles
- Can KillMail and passage data help judge defense zone quality
For them, facilities aren’t one-time interaction tools but part of territorial institutions. If Builders only provide one-off functions without sustained operational perspective, products will struggle to enter these players’ long-term workflows.
5. Builder / Operator
Builders and Operators usually see things more “institutionally”:
- Can this ruleset scale
- Which parts should be on-chain, which parts off-chain
- Is there a way to reduce customer service explanation costs
- Can it accumulate data, build reputation, create reusable templates
This perspective views a base as a replicable business model, not a collection of scattered functions.
0.3 What Are the Most Critical “Game Objects” in This World?
Character
Character is your core identity in the game. You can think of it as “you in the game,” but in EVE Frontier, it also assumes a very special responsibility: it’s the central hub for on-chain permissions and asset control.
Characters have at least three layers of meaning:
- It’s the character identity in the game
- It’s the on-chain Character object
- It’s also the “keychain” holding many OwnerCaps
Later when you learn about OwnerCap, Receiving, and borrow-use-return, if you don’t first know that characters in gameplay are “the carrier of player control,” you’ll easily find the whole design overly convoluted.
Tribe
A tribe can be understood, to start, as the character’s initial affiliation or identity tag. It isn’t necessarily a permanent political faction, but it’s often used for:
- Newbie protection
- Gate access conditions
- Passage permissions
- Faction identification
- Gameplay segmentation
So when you later see “only allow certain tribe through stargate” or “determine turret attitude by tribe,” don’t treat it as just a random u32 field—in gameplay it carries identity classification.
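To give a feel for how small such a rule can be on-chain, here is a hedged sketch of the kind of tribe check a gate extension might perform; the module, struct, and function names are hypothetical, not the world-contracts API:

```move
// Hypothetical sketch of tribe-based gating; not the world-contracts API.
module builder::tribe_gate {
    /// Abort code raised when the character's tribe is not admitted.
    const E_WRONG_TRIBE: u64 = 0;

    /// A gate rule that admits exactly one tribe id.
    public struct TribeRule has store { allowed_tribe: u32 }

    /// Abort unless the character's tribe matches the rule.
    public fun assert_allowed(rule: &TribeRule, character_tribe: u32) {
        assert!(character_tribe == rule.allowed_tribe, E_WRONG_TRIBE);
    }
}
```

The point is that the `u32` only becomes meaningful once a rule like this attaches consequences (passage, protection, targeting) to its value.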
Items
Items are resources, equipment, loot, keys, licenses, and economic carriers in the game. What’s most important in gameplay isn’t whether “it’s an NFT,” but whether it can participate in these actions:
- Be carried by characters
- Be deposited in storage facilities
- Be traded, rented, collateralized, synthesized, destroyed
- Become loot or loss upon player death
- Become a ticket, deposit, or consumable for certain services
This is also why StorageUnit is so important in EVE Frontier. You’re not simply making an on-chain warehouse but controlling “how items flow” in the game.
Bases and Facilities (Assemblies)
Players don’t just survive on wallets and characters—they build facility networks around spatial locations. Facilities are true gameplay amplifiers because they turn “one player’s action” into “one region’s rules.”
What Roles Do High-Frequency Terms Actually Play?
The table below tries to explain terms you’ll frequently see using “gameplay language” rather than “source code language”:
| Term | What It Is in the World | Role It Plays |
|---|---|---|
| Character | Player’s on-chain character identity | Starting point for many permissions, facility control rights, and interaction qualifications |
| Wallet / EVE Vault | Player’s wallet and identity container for initiating on-chain actions | Responsible for signing, holding coins, and connecting dApps, but not equal to the complete game identity |
| Tribe | Identity classification the character belongs to | Often used for faction, whitelist, passage, and newbie protection logic |
| Item | Resources, equipment, licenses, loot, etc. | Foundation material for logistics, trading, insurance, rental, and reward systems |
| Assembly | Facility objects players deploy in the universe | Amplifies individual behavior into regional rules; the actual landing node for gameplay |
| Network Node | Base power core | Determines whether a base can carry more facilities; the prerequisite for operating at all |
| Energy | Power capacity / quota | Determines how many facilities a base can have online simultaneously |
| Fuel | Continuously consumed operational resource | Determines how long facilities can stay online; the operational cost |
| Gate | Space passage and jump entrance | Affects routes, fees, access control, and regional traffic |
| JumpPermit | One-time or time-limited passage license | Turns “can pass through the gate” into explicit rules and assets |
| Turret | Automatic defense facility | Responsible for screening and punishing targets approaching certain areas |
| Storage Unit | Warehousing and item circulation node | Foundation for markets, consignment, rental, prize pools, and loot management |
| Location | Spatial position of an object in the world | Turns “where this thing is” into state referenceable by rules |
| LocationProof | Server-issued certificate proving “you’re actually present” | Brings offline spatial facts on-chain, preventing remote abuse |
| KillMail | Public record of a kill or combat loss | Turns conflicts and losses into indexable, rewardable, statistical facts |
| OwnerCap | Control credential for an object | Determines who is qualified to configure and manage certain facilities or key objects |
| AdminACL | Server authorization whitelist | Allows only trusted backends to write certain high-permission world states |
| Extension | Extended logic a Builder writes for facilities | Determines how facilities charge, grant access, screen, consume, and respond |
If you want to remember these terms more deeply, use a simpler classification:
- Character / Wallet / Tribe: solves “who you are”
- Item / Storage Unit / Logistics: solves “how things flow”
- Gate / Turret / Location: solves “who can enter, who can stay, who gets attacked”
- Network Node / Energy / Fuel: solves “can facilities continue operating”
- KillMail / LocationProof / AdminACL: solves “which facts are worth public verification”
- OwnerCap / Extension: solves “who has authority to change rules, and how rules get attached”
Looking Deeper, What Problems Does Each Term Actually Solve for the World?
If you only memorize “definitions,” these terms still scatter easily. A more useful approach is: what long-term problems does each solve for this world?
| Term | The Real Problem It Solves for the World |
|---|---|
| Character | Centralizes “who the player is and what they can control” into a stable identity node |
| Wallet / EVE Vault | Reduces Web2 login, signing, and on-chain interaction friction, allowing more players to enter the rule system |
| Tribe | Provides the most basic layer of social grouping, giving access control, protection, and screening natural handles |
| Gate | Makes paths no longer naturally given, but governable, fee-chargeable, and institutionalizable |
| JumpPermit | Turns “allowed to pass” from a verbal rule into a verifiable, expirable credential |
| Turret | Gives regional rules automatic enforcement consequences, not just UI prompts |
| Storage Unit | Makes item control and circulation order stably orchestrable |
| Network Node | Makes base expansion face capacity and power realities instead of unlimited facility stacking |
| Energy | Makes “how many facilities can be mounted” a clear resource constraint |
| Fuel | Gives facility operation a sustained cost, forcing management and resupply to happen |
| LocationProof | Makes “must be physically present” rule-verifiable |
| KillMail | Makes combat losses, kills, and war results leave publicly verifiable traces |
| OwnerCap | Makes “object control” not just an address field, but a borrowable, constrainable capability object |
| AdminACL | Lets the world recognize certain off-chain backend inputs without fully opening high permissions to everyone |
| Extension | Lets Builders change rules without directly rewriting the world kernel |
When designing a new product later, you can ask in reverse:
- Are you solving a “path problem” or “permission problem”?
- An “item circulation problem” or “presence verification problem”?
- A “regional consequence enforcement problem” or “combat loss recording problem”?
This way you’ll more easily find which World capabilities to interface with, rather than randomly piling modules.
Why Do Many Beginners Mix Up These Terms?
Because these terms all look like “on-chain concepts,” but they actually come from different layers:
| Layer | Typical Terms | What Questions They Answer |
|---|---|---|
| Identity Layer | Character, Wallet, Tribe | Who’s acting? Whose character? |
| Asset Layer | Item, Storage Unit, KillMail | What things are being held, lost, transferred, recorded? |
| Spatial Layer | Gate, Turret, Location | Where can you go, where can you stay, will you be attacked? |
| Operational Layer | Network Node, Energy, Fuel | Can facilities stay online continuously? |
| Permission Layer | OwnerCap, AdminACL, Extension | Who can configure, who can modify, who can represent server writes? |
As long as you separate these layers first, many concepts won’t pile up together later.
Another Set of Easily Confused Terms
These terms are often mentioned together, but they’re actually not the same:
| Easily Confused Terms | Real Difference |
|---|---|
| Character vs Wallet | The Wallet handles signing and holding on-chain assets; the Character is the game identity and control hub |
| Gate vs JumpPermit | The Gate is the infrastructure itself; a JumpPermit is a credential allowing you to pass through it |
| Energy vs Fuel | Energy is a capacity quota; Fuel is a continuously consumed resource |
| Storage Unit vs Item | The former is a circulation container and rule entry point; the latter is the object being moved, traded, or locked |
| OwnerCap vs AdminACL | The former leans toward object-level control; the latter toward a server-level high-permission whitelist |
| Location vs LocationProof | The former is position state; the latter is proof that “you currently meet certain location conditions” |
| KillMail vs Event logs | A KillMail is an indexable, reusable combat-loss fact object, not just a one-time broadcast message |
0.4 Why Are Smart Assemblies the Gameplay Core?
One of EVE Frontier’s most special features is making infrastructure into player-ownable, configurable, extensible Smart Assemblies.
You can first understand them as “service nodes in space”:
| Facility | Role in Game | Significance to Builder |
|---|---|---|
| Network Node | Base power core, facility networking starting point | Determines which facilities can come online; sets the entire base's capacity ceiling |
| Gate | Controls passage, jumps, routes | Can implement tolls, whitelists, tickets, quest gates, alliance-exclusive routes |
| Turret | Defense and automatic attacks | Can implement defense-zone rules, threat screening, automatic security |
| Storage Unit | Warehousing and item circulation | Can implement stores, rentals, consignment, prize pools, quest delivery points |
The value of these facilities isn’t that “they’re on-chain objects,” but that they directly control players’ most real needs:
- Can I pass through here?
- Can I safely store cargo here?
- Will I be shot by the turrets here?
- Can I buy, rent, or deliver certain items here?
Once a facility controls these entry points, it naturally becomes an economic node, strategic node, or political node.
0.5 Why Can’t Bases Avoid Network Node, Energy, and Fuel?
Many people first encountering EVE Frontier instinctively treat facilities as "on-chain objects you can use once placed." They aren't. In gameplay, facilities have survival costs and operational conditions.
A minimal base roughly looks like this:
First find an anchorable location
-> Establish Network Node
-> Refuel the node and bring it online
-> Connect Gate / Turret / Storage Unit
-> These facilities reserve Energy from the network node
-> They rely on Fuel and online status to continue operating
Here there are at least three completely different constraints:
- Geographic constraint: You must first have a foothold in space
- Capacity constraint: How much Energy the node can provide determines how many facilities you can mount
- Consumption constraint: Fuel burns over time, facilities aren’t permanently cost-free online
This means that in gameplay, "why can't I use my gate" could be any of three completely different problems:
- This gate isn’t connected to an available node at all
- Node capacity is insufficient, Energy is occupied by other facilities
- Fuel burned out, building entered offline state
This is also why the later Energy / Fuel chapters aren’t just technical details. They directly determine whether a base is a truly operable, fee-chargeable, sustainable system.
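The three constraints above can be expressed as a small decision routine. The sketch below is purely illustrative TypeScript written for this book: the types, field names (`energyCapacity`, `fuelRemaining`), and return codes are all invented, not part of any official EVE Frontier API.

```typescript
// Illustrative sketch only: these types and names are invented for this book.
type Facility = { name: string; energyDraw: number; connectedNodeId?: string };
type NetworkNode = { id: string; energyCapacity: number; fuelRemaining: number };

// Diagnose which of the three constraints keeps a facility offline:
// geographic (no node), consumption (no fuel), or capacity (no energy).
function diagnoseOffline(
  facility: Facility,
  nodes: Map<string, NetworkNode>,
  reservedEnergy: Map<string, number> // energy already reserved per node
): string {
  if (!facility.connectedNodeId) return "no-node";      // geographic constraint
  const node = nodes.get(facility.connectedNodeId);
  if (!node) return "no-node";
  if (node.fuelRemaining <= 0) return "fuel-exhausted"; // consumption constraint
  const used = reservedEnergy.get(node.id) ?? 0;
  if (used + facility.energyDraw > node.energyCapacity) {
    return "energy-insufficient";                       // capacity constraint
  }
  return "online";
}
```

Running a facility through this check mirrors the troubleshooting order you would use in-game: first the node connection, then fuel, then the energy budget.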
Why Does a Base Naturally Grow Into a “Service Network”?
Many people think bases are just “placing several buildings together.” But in EVE Frontier, a mature base is more like a small institutional system:
- Network Node provides the underlying power capability
- Gate determines who can enter, exit, detour, or pay
- Turret determines the safety consequences for anyone approaching
- Storage Unit determines how cargo, supplies, and deposits are managed
Once these four facility types are combined, a base is no longer just a foothold; it gradually evolves into:
- Toll outposts
- Alliance defense zones
- Logistics transit stations
- Border trading ports
- War supply nodes
So what Builders actually write is often not individual functions, but rule networks for “how a certain region should operate.”
What “Secondary Gameplay” Usually Emerges Once a Base Matures?
Initially bases are just tools for survival, but once stabilized, many secondary gameplays emerge:
- Fee gameplay: passage fees, storage fees, agent fees, rush fees
- Screening gameplay: whitelists, memberships, tribe exclusives, post-quest access
- Security gameplay: gate-entrance turret linkage, defense zones for critical cargo warehouses, automatic identification of danger-list targets
- Financial gameplay: deposits, insurance, rentals, payouts, bounties
- Social / political gameplay: alliance-exclusive corridors, diplomatic treaties, regional joint defense, war-zone access systems
This is also why a base that’s “just gates and turrets” eventually becomes something like a hybrid of port, checkpoint, market, and border station.
0.6 Why Are Stargates, Location, and Spatial Control So Important?
EVE Frontier is a strongly spatial world. You’re not just “clicking a button to go to a page,” but moving in a universe with distance, paths, and risk exposure.
What Does Gate Mean in Gameplay?
The essence of a Gate isn't its teleportation effect; it is a passage-rights controller. Who can pass, when they can pass, how much they pay, and what conditions they must meet all shape players' movement paths.
So a stargate can naturally evolve into many gameplay forms:
- Toll stations
- Whitelist corridors
- Reward entrance after quest completion
- Alliance-exclusive routes
- Risk zone customs
More specifically, Gates in gameplay control at least 4 things simultaneously:
- Path choice: whether players are willing to take this route depends on whether it's fast, stable, and cheap.
- Access qualification: who can pass, who must first meet conditions, who gets rejected.
- Passage cost: whether this can become a fee node, a monthly-card node, an alliance-privilege node.
- Regional rhythm: once a gate is fee-charged, blockaded, or militarized, the surrounding logistics and conflict distribution change.
So Gates are never “a jump button,” but transportation institutions.
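The access and cost levers above can be sketched as a single quoting function. Everything in this snippet is invented for illustration: real gate extensions are Move contracts, and the standing categories and fee multipliers here are arbitrary.

```typescript
// Illustrative sketch: standing tiers and multipliers are invented for this book.
type Standing = "ally" | "neutral" | "hostile";
type JumpRequest = { standing: Standing; hasPermit: boolean };

// Decide whether a jump is allowed and what it costs:
// allies ride free, neutrals pay the toll, hostiles need a permit at a punitive rate.
function quoteJump(req: JumpRequest, baseFee: number): { allowed: boolean; fee: number } {
  if (req.standing === "ally") return { allowed: true, fee: 0 }; // free corridor
  if (req.standing === "hostile") {
    return req.hasPermit
      ? { allowed: true, fee: baseFee * 10 } // permitted hostiles pay dearly
      : { allowed: false, fee: 0 };          // unpermitted hostiles are rejected
  }
  return { allowed: true, fee: baseFee };    // neutral traffic pays the standard toll
}
```

Note how one small rule table already expresses a toll station, a whitelist corridor, and a risk-zone customs post at once.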
Why Are Gates Often the First Commercialized Facility?
Because Gates naturally sit on “must-pass paths.” Whoever controls paths controls three particularly monetizable capabilities:
- Fee capability: tolls, membership fees, and temporary license fees all come naturally.
- Screening capability: who can enter and who can't directly affects regional population and cargo flow.
- Traffic capability: players lingering at gates mean surrounding storage, shops, and quest points can be activated.
So many Builders’ first truly decent commercial product starts from Gates, not from more abstract Token models.
Why Do We Need Location Proofs?
Because many actions aren't "anyone who owns a certain wallet address can do it," but "only someone who has actually arrived on site can do it." For example:
- You must be near a certain gate to pass through
- You must be near a certain treasure chest to open it
- You must be near a certain market node to trade on-site
On-chain contracts can't know your real-time coordinates in the game world by themselves, so the game server needs to issue a proof that "you're near a certain object," which the chain then verifies. This is the gameplay background for why LocationProof and the proximity systems exist later.
You can also understand location proof as “making geographic presence a ticket condition.” Without it, many gameplays that should depend on on-site presence will degrade into remote script operations:
- Remote gate opening
- Remote chest opening
- Remote trading
- Remote submission of tasks that should be completed on-site
Once these can all be done remotely, space loses meaning, and the strategic value of bases and routes declines together.
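The flow described above (server attests presence, the rule layer verifies before allowing an on-site-only action) can be sketched in a few lines. This is a mental model only: the field names, TTL scheme, and the HMAC standing in for the server's signature are all invented for this example, not the game's real proof format.

```typescript
import { createHmac } from "node:crypto";

// Illustrative sketch of a "location proof" flow; all names are invented.
type LocationProof = { characterId: string; nearObjectId: string; expiresAt: number; sig: string };

const SERVER_SECRET = "demo-secret"; // stands in for the game server's signing key

// The server side: attest that a character is currently near an object.
function issueProof(characterId: string, nearObjectId: string, ttlMs: number, now: number): LocationProof {
  const expiresAt = now + ttlMs;
  const sig = createHmac("sha256", SERVER_SECRET)
    .update(`${characterId}:${nearObjectId}:${expiresAt}`)
    .digest("hex");
  return { characterId, nearObjectId, expiresAt, sig };
}

// The rule layer: check the proof before allowing an "on-site only" action.
function verifyProof(p: LocationProof, requiredObjectId: string, now: number): boolean {
  if (p.nearObjectId !== requiredObjectId) return false; // wrong place
  if (now > p.expiresAt) return false;                   // proof went stale
  const expected = createHmac("sha256", SERVER_SECRET)
    .update(`${p.characterId}:${p.nearObjectId}:${p.expiresAt}`)
    .digest("hex");
  return expected === p.sig;                             // not forged
}
```

The expiry field is what prevents the "remote script operation" degradation: a stale proof of past presence is not a ticket for a present action.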
0.7 How Do Turrets, Defense Lines, and “Regional Order” Come About?
If Gates solve "who can pass," then Turrets solve "who gets shot for approaching."
Turrets in gameplay aren't simply damage devices; they're part of regional order. They turn a space from "anyone can come" into "better think about the consequences before coming." This directly changes:
- Friend-or-foe identification
- Newbie protection
- Base defense
- Corridor deterrence
- Logistics safety costs
By default turrets only provide basic defense logic, but after Builder intervention, turrets can become more complex strategic facilities:
- Prioritize attacking active aggressors
- Spare whitelists or specific tribes
- Escalate threats for high-risk targets
- Work with gates and fees to govern entire regions
So don’t only view turrets as combat modules. They’re essentially spatial governance modules.
In mature regional gameplay, turrets often work together with other facilities:
- Gate + Turret forms a fee-charging border port with enforcement power
- Storage Unit + Turret forms a defense system for high-value material nodes
- LocationProof + Turret forms regional rules where only on-site, condition-meeting players can operate safely
In other words, turrets aren't just damage output; they're "consequence machines for when the rules aren't followed."
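The targeting behaviors listed earlier (spare whitelists, prioritize aggressors, escalate threats) reduce to a ranking function over approaching characters. The sketch below is invented for this book; the game's default turret logic is different and simpler.

```typescript
// Illustrative sketch: target fields and the priority scheme are invented.
type Target = { characterId: string; isAggressor: boolean; onWhitelist: boolean; threat: number };

// Rank approaching targets: whitelisted characters are spared entirely,
// active aggressors come first, then everyone else by threat score.
function pickTarget(targets: Target[]): Target | undefined {
  const eligible = targets.filter((t) => !t.onWhitelist);
  eligible.sort((a, b) => {
    if (a.isAggressor !== b.isAggressor) return a.isAggressor ? -1 : 1;
    return b.threat - a.threat; // higher threat first
  });
  return eligible[0]; // undefined means: hold fire
}
```

The "hold fire" case (everyone whitelisted, or nobody in range) is as much a governance statement as the attack case.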
Why Do Turrets Directly Change Economics?
Because safety is never a free environmental variable but part of transaction costs. As long as turrets exist, these things change:
- Whether cargo players are willing to take certain routes
- Whether high-value cargo dares to temporarily store at certain nodes
- Whether fees in certain regions can be enforced
- Whether pirate interception costs rise
So turrets aren’t just “things military players care about.” They actually affect decisions of merchants, operators, and ordinary passing players.
0.8 Death, Loot, KillMail, and “Visibility of Losses”
EVE-style gameplay has a fundamental difference from many lightweight blockchain games: losses aren’t abstract numerical drops, but real asset, location, and opportunity losses.
A PvP fight or a wrong jump can bring these results:
- Ships or facilities destroyed
- Items lost
- Loot picked up by others
- A transport line forced to halt
- Opponent obtains a public KillMail record
KillMail's significance isn't just "making a combat leaderboard look good." It has at least four roles in gameplay:
- Makes losses publicly verifiable
- Makes combat history part of economic and reputation systems
- Lets Builders design rewards, insurance, bounties, leaderboards around kill records
- Makes conflicts in the world leave lasting traces
This is also why you’ll later see designs around KillMail for insurance, bounties, achievements, and statistics. Because it’s not marginal logs but part of war narratives.
Put more directly, KillMail makes many things in this world institutionalizable for the first time:
- “Is this loss real” can be verified
- “Is this person recently a high-risk target” can be statistically tracked
- “Should insurance pay out” doesn’t rely on hearsay
- “Is this alliance defending routes” can be indirectly reflected through combat records
So it’s not simply combat reports but public baseline for warfare, insurance, reputation, and bounty systems.
Why Does “Death Has Records” Profoundly Change Player Behavior?
Because once losses leave public traces, player behavior is no longer judged only by short-term results; it also affects long-term reputation and institutional evaluation:
- Operators care more about route safety
- Insurers care more about whether payout conditions are real
- Bounty systems can more precisely identify targets
- Players care more about which places are prone to incidents
KillMail makes “what happened in this world” no longer just exist in chat records and memories, but enter the statistically composable rules layer.
0.9 Why Do Storage, Logistics, and Trading Become an Entire Economy?
In EVE Frontier, an item's story doesn't end once it's in your backpack. What's truly valuable is how items are stored, transported, exchanged, and permission-gated.
An item from production to consumption may go through many stages:
Player obtains item
-> Temporarily carry
-> Deposit in Storage Unit
-> Listed for sale / as deposit / as rental target
-> Purchased, rented, redeemed, or consumed by other players
-> Recirculated in war, transport, or death
Thus storage facilities are no longer just warehouses; they naturally evolve into:
- Stores
- Vending machines
- Consignment points
- Shared warehouses
- Reward pools
- Loot recovery points
- Inter-alliance material exchange interfaces
From a gameplay perspective, where Builders most easily create value is often not "issuing a new Token," but making existing item-circulation paths smoother, safer, and more rule-bound.
This is also why the Storage Unit is often the most underestimated facility. Many product innovations look like markets, rentals, insurance, loot boxes, or quest systems on the surface, but underneath they all answer one question:
By whom, and under what conditions, can a given item be taken out, deposited, locked, released, transferred, destroyed, or redeemed?
Whoever controls this process controls a considerable portion of the game economy.
Why Is Storage Unit Often the “Behind-the-Scenes Star” of All Gameplay?
Because many systems ultimately come down to “where things are placed, who can take them, when to release”:
- Markets must solve settlement
- Rentals must solve temporary transfer and expiry recovery
- Reward pools must solve conditional release
- Insurance must solve payout materials or fund distribution
- Quest systems must solve the sequencing of item delivery and pickup
So although Storage Unit doesn’t look as conspicuous as Gate on the surface, it’s often the facility closest to the economic base.
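The question underneath all of these systems, "when may an item leave the container, and to whom," can be sketched as a single predicate. The `Hold` type and its fields are invented for this example; real storage extensions are Move contracts with their own schemas.

```typescript
// Illustrative sketch: a generic "may this item leave the container" rule,
// the primitive underneath markets, rentals, escrow, and reward pools.
type Hold = { itemId: string; ownerId: string; lockedUntil: number; releaseTo?: string };

function canWithdraw(hold: Hold, requesterId: string, now: number): boolean {
  if (now < hold.lockedUntil) return false;                  // still locked (rental, escrow, prize pool...)
  if (hold.releaseTo) return requesterId === hold.releaseTo; // conditional release to a designated party
  return requesterId === hold.ownerId;                       // default: owner only
}
```

A market is a hold whose `releaseTo` flips to the buyer on payment; a rental is a hold whose `lockedUntil` is the lease expiry. Most "economic" products are variations on this predicate.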
0.10 In the Economic System, What Roles Do SUI and LUX Play?
In this book's context, a sufficiently practical working understanding is:
- SUI is more like the underlying on-chain fuel and general settlement asset
- LUX is more like an asset tied to game services and day-to-day economic interactions
Many Builder products will design around these actions in gameplay:
- Passage fees
- Service fees
- Deposits
- Rental settlement
- Insurance premiums and payouts
- Reward distribution
What's truly important isn't "which coin is more advanced," but whether the fee model you design fits the gameplay. For example:
- High-frequency small-amount services care more about low friction
- Risk scenarios need more deposits and breach penalties
- Long-term relationships suit subscriptions, memberships, license models better
- War-related products value payouts, guarantees, and reputation more
So economic design is never an independent chapter. It’s always embedded in gameplay like passage, defense, warehousing, logistics, and diplomacy.
0.11 Which Things Are On-Chain, Which Things Still on Game Server?
The most critical point in understanding EVE Frontier is not to mash "open-world game" and "on-chain contracts" into one system.
| Better Suited for Game Server Processing | Better Suited for On-Chain Processing |
|---|---|
| Real-time position, physics simulation, combat process | Asset ownership, permission objects, license rules, fee settlement |
| High-frequency state refresh, instant judgments | Verifiable records, long-term state, open composition interfaces |
| In-game coordinates and real-time observation | KillMail, OwnerCap, JumpPermit, configuration objects |
The two sides collaborate; neither replaces the other.
You can understand it as:
- The game server handles "what's happening in the world"
- The blockchain handles "which rules and results need to be public, persistent, and composable"
Precisely because of this, what Builders write is more like “rule infrastructure,” not taking over the entire game engine.
If this layer of division is misunderstood, two common misconceptions will arise later:
- One misconception is "why not move all real-time gameplay on-chain": real-time physics and high-frequency judgments aren't suitable to become public-chain state directly
- The other misconception is "since the server knows, why put it on-chain": because permissions, assets, licenses, payouts, and public records are precisely the things that need public verification
EVE Frontier's special feature isn't committing fully to one side, but piecing together the parts each side excels at into a stronger system.
As a Builder, 10 Types of Gameplay Entry Points Most Worth Observing First
If you want to transition from “knowing the worldview” to “knowing what to do,” the most worthwhile first observations are these entry points:
- Who sits on routes with natural traffic.
- Which locations require players to physically be present.
- Which items are frequently deposited, withdrawn, traded, bet on, or consumed.
- Which rules need automatic enforcement, not just verbal agreements.
- Which losses need public records to support subsequent payouts or statistics.
- Which regions already have stable but inefficient fee or screening demand.
- Which services can’t scale because they need manual trust.
- Which alliances or groups need to long-term maintain their own order and boundaries.
- Which players abandon certain trades or collaborations due to excessive operational costs.
- Which places, once rules are standardized, can be replicated to many bases and routes.
These 10 types of entry points are essentially where Builders most easily discover real needs.
0.12 What Can Developers (Builders) Do in This World?
After understanding players’ game loops, as a developer (Builder), you can leverage Sui smart contracts and EVE Frontier’s open interfaces to excel in four core directions:
1. Write Smart Component Extensions
In-game stargates (Gate), turrets (Turret), and storage boxes (Storage Unit) only have the most basic operational logic by default. Builders can write Move contracts to turn these facilities into complex business rule engines:
- Stargate toll stations: Charge per use, provide monthly card subscriptions, or charge high “tolls” for hostile alliances.
- Smart fire control networks: Let turrets identify “wanted criminals with bounties” or “passersby who haven’t paid protection fees” and automatically fire.
- Automated logistics boxes: Players deposit ore objects, contract automatically settles LUX tokens at current market rate.
2. Build Decentralized Applications & Interfaces (dApps & UI)
With on-chain rules in place, players need convenient user interfaces to interact with them. You can build on EVE Vault (the player digital-identity plugin) together with a front-end and the dApp Kit:
- Ticket sales hall: A webpage for players to purchase “Jump Permits” online.
- Alliance finance dashboard: Shows shared treasury real-time balance, daily tax revenue, and loot distribution records.
- In-game embedded overlay: Web UI that pops up directly in-game, letting players complete interactions and signing without leaving the client.
3. Design Advanced Deep Space Finance & Economic Models (DeFi)
Since all items (ships, weapons, resources) can be mapped on-chain as unique Objects, EVE Frontier is naturally fertile ground for hardcore financial protocols:
- Combat loss insurance contracts: Verify exact combat losses based on kill logs (KillMail) and automatically pay out to players whose ships were destroyed.
- Decentralized rental protocols: Mechanisms that let players safely borrow premium firepower equipment (like turrets) temporarily, automatically forfeiting the deposit or revoking control if it isn't returned on time.
- Futures & options markets: Combine major corporations' war-zone resource output rates to establish deep trading pools (like DeepBook integration) to lock in mineral prices.
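The insurance idea above hinges on one check: does this KillMail match this policy? The sketch below is invented for this book; the field names and payout formula are not the game's actual KillMail schema.

```typescript
// Illustrative sketch of a KillMail-driven insurance settlement; all names invented.
type KillMail = { victimId: string; shipValue: number; solarSystem: string; timestamp: number };
type Policy = { insuredId: string; coveredSystems: string[]; coverageRate: number; expiresAt: number };

// Pay out only if the loss matches the policy by identity, time window, and region.
function settleClaim(km: KillMail, policy: Policy): number {
  if (km.victimId !== policy.insuredId) return 0;            // not the insured party
  if (km.timestamp > policy.expiresAt) return 0;             // policy lapsed
  if (!policy.coveredSystems.includes(km.solarSystem)) return 0; // uncovered region
  return Math.floor(km.shipValue * policy.coverageRate);     // proportional payout
}
```

Because a KillMail is a verifiable fact object rather than hearsay, this check can run without trusting either the claimant or the insurer.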
4. Data Intelligence & Intelligence Networks (Data & Intel Indexing)
Every on-chain event broadcasts real-time changes in the game world. Builders can use indexers or backend monitoring to build ecosystem intelligence tools:
- Interstellar warfare heat maps: By aggregating network-wide jump permit records and KillMail data, show in real time which star systems are erupting in warfare and becoming frontline high-risk areas.
- Network-wide bounty hunter red-name list: Record each player's malicious breaches, hostile kills, and other on-chain footprints, letting mercenary alliances form quantifiable reputation ratings.
In short, ordinary players are experiencing this digital universe’s cruelty and grandeur, while Builders are directly writing and distributing this universe’s fundamental operating laws.
0.13 Using a Complete Scenario to String These Concepts Together
Suppose a Builder team operates an outpost base on an important route:
- They first establish a Network Node, giving the entire base power capability.
- Then they deploy a Gate, turning this route into a fee-chargeable corridor.
- They deploy a Turret near the gate to prevent hostile players from freeloading or harassing others.
- They add a Storage Unit, letting passing players buy supplies, store materials, and pay deposits.
- To keep just anyone from passing through the gate, they write an extension for the Gate:
- Whitelist alliance members pass free
- Ordinary players pay tolls
- High-risk area players must hold temporary permits
- To handle "only allowing trades if the person is on-site" gameplay, they require market operations to attach a location proof.
- When kills occur nearby, the KillMail gets indexed and used to grant combat rewards to security alliances.
You’ll find this already strings together most major themes of the entire book:
- Character and permissions
- Gate / Turret / Storage Unit
- Energy / Fuel
- LocationProof
- KillMail
- Fees, licenses, insurance, rewards
- dApp display and off-chain indexing
So each later chapter isn't really about "an abstract technical point"; it's dissecting a category of real gameplay needs in this world.
0.14 After Reading This Chapter, What Should You Bring Into Subsequent Chapters?
If this chapter has established these intuitions for you, later content will go much smoother:
- EVE Frontier’s core isn’t issuing assets but operating infrastructure and rule entry points.
- Facilities are gameplay nodes, not decorations.
- Location, passage, warehousing, defense, combat losses, and economics are interconnected.
- Game server and on-chain contracts have clear division of labor but collaborate on key rules.
- Builders write not “plugin features” but rule systems that may long-term embed into world order.
Next, we recommend proceeding in this order:
- Chapter 1: EVE Frontier Macro Architecture
- Chapter 2: Development Environment Setup
- Chapter 3: Move Contract Basics
And when you later reach chapters like these, you can always come back to this one:
- Chapter 16: Location and Proximity Systems
- Chapter 32: KillMail System
- Chapter 29: Energy and Fuel Systems
- Chapter 30: Extension Pattern Practice
- Chapter 31: Turret AI Extensions
- Chapter 26: Complete Access Control Analysis
Chapter 1: EVE Frontier Macro Architecture and Core Concepts
Objective: Understand what EVE Frontier is, why it chose the Sui blockchain, and the core philosophy of a “programmable universe.”
Status: Foundation chapter. The main text focuses on macro architecture and terminology establishment, suitable as an entry point for the entire book.
If you haven't yet formed a clear intuition about the game itself, we recommend first reading the Prelude: First Understand the EVE Frontier Game.
1.1 Why is EVE Frontier Different?
In traditional online games, world rules are dictated solely by the developer: economic systems, combat formulas, content updates. Players are merely participants. EVE Frontier challenges this paradigm: the game's core mechanics are open, and developers (Builders) can truly rewrite and extend game rules within the framework defined by the game server.
This isn't simply "MOD plugins": the logic you write runs as smart contracts on the Sui public blockchain, is permanently auditable, requires no centralized server hosting, and executes automatically 24/7.
What is it NOT?
Beginners most easily confuse EVE Frontier with the following things, but it’s not entirely equivalent to any of them:
| Easily Confused With | Why Similar | Why Different |
|---|---|---|
| Traditional MOD / Plugin Systems | Both allow third-party extension of game logic | MODs typically run on centralized servers or clients; EVE Frontier’s key state and rules can be on-chain, auditable, and composable |
| Private Server Script Systems | Both can modify default gameplay | Private server scripts are usually controlled unilaterally by operators; Builder contracts can form a public, verifiable rules market |
| Ordinary Blockchain Game Contracts | Both have NFTs, Tokens, markets | EVE Frontier’s focus isn’t on individual asset contracts, but on turning game infrastructure like “stargates, turrets, storage units” into programmable objects |
| Pure On-chain Games | Both emphasize on-chain rules | EVE Frontier still retains game servers, physics simulation, and a real-time world, so it’s a hybrid system of “on-chain rules + game server collaboration” |
You can understand it as:
EVE Frontier isn’t about “putting the entire game on-chain,” but rather putting the sufficiently important, sufficiently composable, sufficiently worthy of public verification portions of game rules on-chain.
Three Player Roles
| Role | Primary Actions |
|---|---|
| Builder (Constructor) | Write Move contracts, deploy smart components, build dApp interfaces |
| Operator (Manager) | Purchase/own facilities, configure Builder modules, manage economic factions |
| Player | Interact with facilities built by Builders/Operators, forming the game world |
The target audience of this course is Builders, but understanding the other two roles helps you design more valuable products.
How Do These Three Roles Interact?
Many people initially think these three roles are completely separate. Actually, they’re not—they describe three types of responsibilities within the same ecosystem:
- The Builder is responsible for defining rules. Example: write a toll stargate, a rental market, an alliance dividend system
- The Operator is responsible for operating rules. Example: actually buy facilities, set rates, issue passes, maintain inventory
- The Player is responsible for consuming rules. Example: buy tickets, rent equipment, pass turret checks, claim rewards
A minimal business chain typically looks like this:
Builder writes contract
-> Operator deploys and configures facility
-> Player interacts with facility
-> On-chain state changes are consumed by dApp / other Builders
This is why Builders can’t just know how to write contracts. You also need to understand:
- What Operators care about: revenue, permissions, security, maintenance costs
- What Players care about: price, convenience, predictability, whether they’re being scammed
What Does a Minimal Builder Closed Loop Look Like?
If you compress the EVE Builder ecosystem into a minimal closed loop, it’s typically this chain:
Builder designs rules
-> Deploys facilities and extensions
-> Operator configures parameters and operates
-> Player pays or meets conditions to use
-> Transactions, permissions, and asset changes land on-chain
-> Front-end and indexing layer display results
Every link in this chain is essential:
- Without Builders, there are no new rule facilities in the world
- Without Operators, facilities lack sustained managers
- Without Players, rules won’t form real economic activity
So when designing any component later, it’s best to first ask yourself three things:
- Who defines the rules?
- Who operates and maintains it?
- Why would players be willing to use it?
1.2 Smart Assemblies: Programmable Space Infrastructure
Smart Assemblies are physical facilities built by players in space in EVE Frontier. They are both game objects and programmable contract objects on the blockchain.
More precisely, a smart assembly typically has three layers of identity simultaneously:
- A physical facility in the game world: you can actually see a turret or stargate in space
- A shared object on-chain: it has an object ID, state fields, permission rules
- A service entry point accessible to dApps: a front-end can query its inventory, rates, and online status, and initiate transactions
So when you say “I made a smart stargate,” you’re essentially not just making a UI, nor just writing a Move module, but creating:
An infrastructure service that’s visible in-game, verifiable on-chain, and operable from the front-end.
Main Component Types
🏗 Network Node
- Anchored at Lagrange Points
- Provides Energy for the entire base
- All facilities must connect to a network node to operate
- Not directly programmable, but the operational foundation for other components
📦 Smart Storage Unit (SSU)
- Stores items on-chain, supports “main inventory” and “Ephemeral Inventory”
- By default, only allows Owner to deposit/withdraw items
- Through custom contracts can become: vending machines, auction houses, guild vaults
⚡ Smart Turret
- Automated defense facility
- Default behavior is standard attack logic
- Through contracts can customize target locking judgment logic (e.g., only attack characters without permits)
🌀 Smart Gate
- Links two locations, allows character jumps
- By default, everyone can jump
- Through contracts introduces “Jump Permit” mechanisms, enabling whitelists, fees, time limits, etc.
Four Most Common Builder Transformation Directions for Components
| Component | Default Capability | Most Common Builder Transformations |
|---|---|---|
| Network Node | Provides power and network foundation | Generally don’t directly modify logic, but build upper-layer business around energy/network status |
| Storage Unit | Store/retrieve items | Shops, auctions, rentals, quest storage, alliance vaults |
| Turret | Auto-attack | Whitelists, paid protection, combat event linkage, priority AI |
| Gate | Allow jumps | Fees, permits, quest thresholds, faction/camp filtering |
If you’re unsure which component to start from for an idea, first ask yourself:
- Is it fundamentally about "storing things"? Start with Storage Unit.
- Is it fundamentally about "deciding who can pass"? Start with Gate.
- Is it fundamentally about "deciding who gets attacked"? Start with Turret.
- Does it fundamentally depend on power/network constraints? You also need to understand Network Node.
Smart Assembly Lifecycle
A smart assembly isn't "done" once deployed on-chain; it typically goes through an entire lifecycle:
- Creation / Anchoring: the facility is first established in the world
- Attribution: it's bound to a character or operating entity
- Online: it obtains energy, network, and interactive state
- Extension: a Builder plugs in custom rules
- Operation: an Operator adjusts prices, inventory, permissions
- Consumption: players have real interactions with it
- Offline / Migration / Deactivation: the facility may lose energy, be upgraded, replaced, or cease operations
The contracts, dApps, scripts, wallets, and indexing you’ll learn later all serve around this lifecycle.
1.3 Three-Layer Architecture: How is the Game World Built?
EVE Frontier’s world contracts use a strict three-layer architecture, which is key to understanding all subsequent content:
┌────────────────────────────────────────────────────┐
│ Layer 3: Player Extensions (Player Extension Layer) │
│ Your Move contracts are here │
└────────────────┬───────────────────────────────────┘
│ Invoked via Typed Witness Pattern
┌────────────────▼───────────────────────────────────┐
│ Layer 2: Smart Assemblies (Smart Component Layer) │
│ storage_unit.move gate.move turret.move │
└────────────────┬───────────────────────────────────┘
│ Internal calls
┌────────────────▼───────────────────────────────────┐
│ Layer 1: Primitives (Basic Primitive Layer) │
│ status location inventory fuel energy │
└────────────────────────────────────────────────────┘
- Layer 1 - Primitives: Bottom-layer modules not directly callable, implementing “digital physics” (like location, inventory, fuel)
- Layer 2 - Smart Assemblies: Component objects exposed to players, each is a Sui Shared Object
- Layer 3 - Player Extensions: Where you as a Builder work, safely inserting custom logic through Typed Witness
Key Understanding: You cannot directly modify Layer 1/2, but you can write logic in Layer 3 that interacts with components through officially authorized APIs. This ensures both the safety of the game world and provides sufficient freedom for Builders.
What Does Each Layer Actually Handle?
| Layer | Responsible For | Typical Questions | How You Usually Encounter It |
|---|---|---|---|
| Layer 1: Primitives | Define lowest-level world rules | How are location, inventory, fuel, energy, state transitions represented | Usually understood through source code deep reading, not directly modified |
| Layer 2: Assemblies | Package low-level rules into facilities players can use | How gates jump, turrets shoot, storage units deposit/withdraw | Interact through official APIs, official component entry points |
| Layer 3: Extensions | Insert custom business logic without breaking the core | Who can pass gates, how much to charge, what conditions must be met before release | This is the Builder’s main battlefield |
A very practical judgment criterion:
- If you’re defining “world basic laws,” that’s usually a Layer 1 issue
- If you’re defining “how official facilities work by default,” that’s usually a Layer 2 issue
- If you’re defining “how I want my facility to work,” that’s usually a Layer 3 issue
How Does a Real Interaction Pass Through Three Layers?
Taking “player pays to pass through stargate” as an example:
Player clicks "Purchase and Jump" in dApp
-> Layer 3: Your fee extension checks if payment made / if holding ticket
-> Layer 2: Gate component executes jump entry
-> Layer 1: Underlying location, state, permissions, fuel primitives complete validation and state updates
-> Result written back to on-chain object, front-end refreshes
So when writing extensions, always keep these distinctions in mind:
- Which part is “my business rules”
- Which part is “behavior guaranteed by official components”
- Which part is “underlying world physics rules”
Why is This Layering Important for Builders?
Because it directly determines where you should write your logic.
For example, if you want to make a “toll stargate”:
- Fee rules and discount strategies: Write in your extension
- How the stargate jump itself executes: Handled by official components
- Location, permissions, state transitions involved in jumping: Handled by underlying primitives
If you mix these three things together, two common problems will occur:
- You reimplement rules in extensions that are already guaranteed at the bottom layer
- You think you can modify the official component core, but actually have no such authority
What is Typed Witness?
Here’s an intuitive understanding first, without diving into syntax details:
- You can’t just tell an official stargate “use my function from now on”
- You must access through a type identity marker accepted by officials
- This type identity is the Typed Witness that will appear repeatedly in later chapters
You can roughly understand it as:
“I’m not directly modifying official code, but holding a typed authorization badge to attach my extension logic to official components.”
Later in Chapter 30 you’ll see how it works specifically.
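The real mechanism lives in Move's type system, but a loose TypeScript analogy can convey the idea. All names below are hypothetical and not part of any official API: a type whose constructor is private can only be instantiated by its own module, so receiving an instance proves the call came through that module's rules.

```typescript
// Hypothetical sketch of the witness idea: a "badge" only one module can mint.
// In Move this is enforced by the type system across packages; here a private
// constructor plays the same role within one file.

class GateAuth {
  // Private constructor: no code outside this class can call `new GateAuth()`.
  private constructor() {}

  // Only the extension hands out witnesses, and only after its own
  // rules pass (here: a toll payment check with made-up numbers).
  static authorize(paidLux: number, toll: number): GateAuth | null {
    return paidLux >= toll ? new GateAuth() : null;
  }
}

// The "official component" accepts the call only when a witness is presented.
function executeJump(auth: GateAuth): string {
  // Holding a GateAuth proves the caller went through the extension's rules.
  return "jump executed";
}

const auth = GateAuth.authorize(100, 50);
console.log(auth ? executeJump(auth) : "payment rejected");
```

The point is not the runtime check but the shape: the component never inspects *who* called it, only *what type* of badge was presented.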
1.4 Why Choose the Sui Blockchain?
EVE Frontier’s migration to Sui wasn’t accidental, but a carefully considered technical choice.
Sui’s Core Advantages
| Feature | Traditional Blockchain | Sui |
|---|---|---|
| Asset Model | Account balance model | Centered on Objects, each asset has unique ID and ownership history |
| Concurrent Processing | Serial execution | Independent objects can execute in parallel, extremely high throughput |
| Transaction Latency | Seconds to minutes | Sub-second finality |
| Player Experience | Need to manage mnemonic phrases | zkLogin: Login with Google/Twitch account |
| Gas Fees | User pays | Supports sponsored transactions, developers can pay |
What Does the Object Model Mean?
On Sui, every item, every character, every component in the game is an independent on-chain object, with:
- A unique ObjectID
- Clear ownership (owned by address / shared / owned by object)
- A complete, traceable operation history
This is especially important for game worlds, because many game objects are naturally suited to “independent entity” representation:
- A permit is an independent ticket
- A warehouse is an independent facility
- A treaty is an independent agreement
- A kill record is an independent battle report
When these things are all objects, you can naturally do:
- Transfer
- Authorization
- Query
- Composition
- History tracking
This is why EVE Frontier can make “facilities, permissions, transactions, events” into a programmable ecosystem, rather than a pile of database records that can only be consumed internally.
This makes decentralized ownership, trading, and game history archives naturally viable capabilities.
Three Most Critical Object States
If these states aren't clear to you, much of the later content will feel confusing.
| Object State | Meaning | Common Examples in EVE Frontier |
|---|---|---|
| owned by address | Object is directly owned by an address | NFTs in player wallets, certain credential objects |
| shared | Anyone can access the object when its rules are satisfied | Stargates, turrets, markets, shared vaults |
| owned by object | Object is held by another object | Capability objects held by characters, internal facility assets |
These three states determine almost all your later designs:
- How to write permissions
- How to assemble transactions
- How the front-end queries objects
- Whether parallel execution is possible
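As a mental model only (these are not the real Sui types), the three states can be sketched as a discriminated union, with a mutation check that branches on them:

```typescript
// Hypothetical sketch of the three Sui ownership states.
// Real on-chain objects are Move structs; this only models the concept.

type Ownership =
  | { kind: "owned_by_address"; owner: string }  // e.g. an NFT in a wallet
  | { kind: "shared" }                           // e.g. a stargate anyone may call
  | { kind: "owned_by_object"; parent: string }; // e.g. a facility's internal config

// Who may send a transaction that mutates the object?
function canMutate(obj: Ownership, sender: string): boolean {
  switch (obj.kind) {
    case "owned_by_address":
      return obj.owner === sender; // only the owning address
    case "shared":
      return true;                 // anyone; contract logic gates the details
    case "owned_by_object":
      return false;                // only reachable through the parent object
  }
}

console.log(canMutate({ kind: "owned_by_address", owner: "0xabc" }, "0xabc"));
```

Notice that for shared objects the answer is "yes, anyone can send the transaction"; the actual rules live inside the contract logic, which is exactly where Builder extensions come in.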
Why is the Object Model Particularly Suitable for Space Games?
Because space games are naturally “many discrete objects interacting”:
- Ships are objects
- Characters are objects
- Gates, turrets, storage units are objects
- Passes, policies, rental vouchers are also objects
Sui's object model means these things don't need to be forcibly stuffed into a centralized database table or one huge contract mapping. You can make each facility, voucher, and relationship an independent object, then tie them together through:
- Ownership relationships
- Shared access
- Events
- Dynamic fields
Why are Sponsored Tx and zkLogin Important for Game Experience?
The two most discouraging points for players in traditional on-chain applications are:
- Need to learn about wallets, mnemonic phrases, Gas first
- Must pay transaction fees yourself for every action
Sui’s value in EVE Frontier isn’t just “higher performance,” but that it provides the foundational conditions for gradually introducing Web2 players to on-chain interactions:
- zkLogin: Lower wallet barriers
- Sponsored Tx: Lower transaction barriers
- Low-latency object transactions: Reduce interaction waiting time
These three points combined make “one click in-game completes on-chain action” a reality.
1.5 EVE Vault: Your Identity and Wallet
EVE Vault is the officially provided browser extension + Web wallet, serving as your digital identity as a Builder and player.
Core Functions
- Store LUX, EVE Token, and in-game NFTs
- Create Sui wallet through zkLogin using EVE Frontier SSO account, no need to manage mnemonic phrases
- Serve as dApp connection protocol, authorizing third-party dApp access in-game and external browsers
- FusionAuth OAuth binds game character identity with wallet
How is it Different from Regular Wallets?
Regular crypto wallets typically follow the mindset: “Have wallet first, then find applications.”
EVE Vault is more like: “I’m first a user in EVE Frontier, then wallet capability naturally embeds into this identity system.”
This means it simultaneously handles three things:
- Asset Container: Holds LUX, Tokens, NFTs, and credentials
- Identity Bridge: Connects the game account, SSO login, and Sui address
- Interaction Authorizer: Provides dApps with connection, signing, and sponsored-transaction capabilities
What Do You Need to Remember About zkLogin First?
Don’t dive into cryptographic details right away—understanding these three points is enough:
- It lets users enter on-chain systems using familiar login methods
- Under the hood, it still resolves to a wallet identity usable on Sui
- It isn't "no wallet," but "the wallet creation and recovery experience, repackaged"
Chapter 33 dives into its proof structure and ephemeral key mechanism.
Two Currencies
| Currency | Purpose |
|---|---|
| LUX | Main in-game transaction currency, used for purchases, services, fees, etc. |
| EVE Token | Ecosystem participation token, used for developer incentives, special asset purchases |
1.6 Programmable Economy: Builder’s Business Possibilities
Review what real business logic Builders can implement:
💰 Economic Systems
├── Custom trading markets (auto-matching, bidding auctions)
├── Alliance tokens (Sui-based Fungible Tokens)
└── Service fees (stargate tolls, storage rent)
🛡 Security & Permissions
├── Whitelist access control (which players can use your facilities)
└── Conditional locks (only characters who complete quests can withdraw items)
🤖 Automation
├── Turret custom locking logic
├── Automatic item distribution (quest rewards, airdrops)
└── Cross-facility linkage (facility A's behavior triggers facility B's response)
🏗 Infrastructure Services
├── Third-party dApps read on-chain state
└── External API linkage (off-chain data triggers on-chain actions)
What Does a Minimal Builder Business Closed Loop Look Like?
If you still find “what Builders actually do” a bit abstract, remember this minimal closed loop:
I control a facility
-> I define rules others must follow when using it
-> Rules written on-chain
-> Players use facility after paying/holding credentials/meeting conditions per rules
-> Revenue, permissions, credentials, history records all stay on-chain
For example:
- Toll stargate: Charge per use
- Alliance warehouse: Place items by permissions
- Quest gate: Can only enter after completing assessment
- Auction box: Sell resources by price curve
The biggest difference from “making a regular game plugin” is:
- Rules are public
- State is verifiable
- Asset flows are traceable
- Other Builders can continue composing your rules
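The rule layer of such a facility is often just a small, pure function over on-chain state. A minimal TypeScript sketch of a toll-gate rule follows; every name here is hypothetical, and on-chain this logic would live in a Move module, but the shape is the same: read state, apply the Builder's rules, allow or deny.

```typescript
// Hypothetical rule check for a toll stargate extension.

interface GatePolicy {
  tollLux: number;        // price per jump
  whitelist: Set<string>; // characters that jump free
}

function mayJump(policy: GatePolicy, characterId: string, paidLux: number): boolean {
  if (policy.whitelist.has(characterId)) return true; // allies pass free
  return paidLux >= policy.tollLux;                   // everyone else pays the toll
}

const policy: GatePolicy = { tollLux: 50, whitelist: new Set(["CHAR-001"]) };
console.log(mayJump(policy, "CHAR-001", 0));  // whitelisted
console.log(mayJump(policy, "CHAR-999", 50)); // paid the toll
console.log(mayJump(policy, "CHAR-999", 10)); // underpaid
```

Because the rule is on-chain, the whitelist and toll are public and verifiable, which is exactly the difference from a private server-side plugin.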
After Finishing Chapter 1, You Should Be Able to Answer These 5 Questions
- Why isn’t EVE Frontier a regular MOD system?
- What are Builder, Operator, and Player each responsible for?
- Why is a Smart Assembly both a game facility and an on-chain object?
- In the three-layer architecture, which layer do Builders actually work in?
- Why is Sui’s object model more suitable for this type of game than traditional account balance models?
🔖 Chapter Summary
| Learning Point | Core Concept |
|---|---|
| EVE Frontier’s Positioning | Truly open programmable universe, Builders can rewrite game rules |
| Smart Assembly Types | Network Node / SSU / Turret / Gate |
| Three-Layer Architecture | Primitives → Assemblies → Player Extensions |
| Why Sui | Object model, concurrency, low latency, zkLogin frictionless experience |
| EVE Vault | Official wallet + identity system, based on zkLogin |
📚 Extended Reading
- Why Build on EVE Frontier?
- Smart Infrastructure
- EVE Frontier World Explainer
- Sui Documentation: Object Model
Chapter 2: Sui and EVE Environment Configuration
Objective: Complete only the two installations that are fundamental and necessary for this book: Sui CLI and EVE Vault. This chapter does not cover common development tools like Git, Docker, Node.js, or pnpm.
Status: Foundation chapter. The main text only covers installations and configurations directly related to Sui and EVE Frontier.
2.1 What Does This Chapter Install?
This chapter only handles two types of installations directly related to this book:
| Tool | Version Requirement | Purpose |
|---|---|---|
| Sui CLI | testnet version | Compile and publish Move contracts |
| EVE Vault | latest version | Browser wallet + identity |
2.2 Why Only Install These Two First?
Because before you continue reading, the minimum capabilities you truly need are only two:
- You can run sui commands locally: all your Move compilation, testing, publishing, and object queries depend on it
- You have a working EVE wallet identity in the browser: all your dApp connections, signing, and test-asset claiming depend on it
Tools like Git, Docker, Node.js, pnpm will certainly be used later, but they belong to:
- Common development tools
- Scaffolding engineering tools
- Front-end and script running tools
These are better installed when you reach Chapters 6 and 7, alongside the project directory they belong to.
What This Chapter Really Establishes Isn’t Just Two Software Programs
More precisely, this chapter establishes two working entry points:
- Command-line entry For compilation, publishing, querying, testing
- Browser entry For wallet connection, signing, dApp interaction
Almost all your development actions later will switch back and forth between these two entry points:
- After writing contracts, use CLI to compile and publish
- Open front-end, use EVE Vault to connect and sign
- When querying objects, might use CLI, or might use front-end or GraphQL
So although this chapter seems to be just about installation, it’s actually laying down the “workbench” for the entire book.
2.3 Installing Sui CLI
We recommend the official suiup installer, which spares this chapter from having to distinguish between system toolchains like Homebrew, apt, and nvm.
# Install suiup
curl -sSfL https://raw.githubusercontent.com/MystenLabs/suiup/main/install.sh | sh
# Reopen terminal or reload shell, then execute
suiup install sui@testnet
# Verify
sui --version
If sui --version can output the version number normally, the first step of this chapter is complete.
2.4 Initialize Sui Client
After installing Sui CLI, you need to initialize the client and connect to the network:
# Initialize configuration (first run will prompt to select network)
sui client
# Select testnet, or connect to local node:
# localnet: http://0.0.0.0:9000
# View current address
sui client active-address
# View balance
sui client balance
What Did You Actually Complete Here?
After executing sui client, you’ll have a basic on-chain identity and network configuration locally:
- Current active address
- Current default network
- RPC configuration corresponding to that network
- Account context that local CLI will use when sending transactions and querying objects
In other words, sui client isn’t simply a command to “check balance,” but laying the foundation for all your Move development actions.
What’s the Relationship Between sui client and EVE Vault?
These two things are most easily confused by beginners:
- sui client is the identity and network configuration in the command-line environment
- EVE Vault is the identity and signing entry point in the browser environment
They can both represent “you,” but serve different scenarios:
- When you publish contracts, run tests, or query objects in the terminal, you mainly rely on sui client
- When you connect dApps, click buttons, or sign transactions in web pages, you mainly rely on EVE Vault
Must They Be the Same Address?
Not necessarily.
Many developers will encounter this situation:
- CLI uses one test address
- EVE Vault has another zkLogin address
This isn’t absolutely wrong, but you must be very clear about:
- Which address you’re publishing packages from now
- Which address or character controls your facilities
- Which wallet address your front-end connects to
If these three things aren't aligned, you'll frequently hit "I clearly published it, so why can't the front-end see it or operate it?" problems.
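A throwaway check like the following can save a lot of head-scratching. It is purely illustrative: the addresses are placeholders you would paste in yourself from `sui client active-address`, your EVE Vault, and your publish output.

```typescript
// Hypothetical sanity check: are the three identities you care about aligned?

function checkAlignment(cliAddress: string, walletAddress: string, publisher: string): string[] {
  const warnings: string[] = [];
  if (cliAddress !== publisher) {
    warnings.push("CLI address is not the publisher: terminal queries may work, but you can't administer the package.");
  }
  if (walletAddress !== publisher) {
    warnings.push("Wallet address is not the publisher: dApp buttons may fail permission checks.");
  }
  if (cliAddress !== walletAddress) {
    warnings.push("CLI and wallet differ: objects visible in the terminal may be absent from the front-end.");
  }
  return warnings;
}

console.log(checkAlignment("0xaaa", "0xaaa", "0xaaa")); // no warnings when aligned
```

Misalignment isn't always wrong, but every warning printed here corresponds to one of the confusing symptoms described above.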
Get Test SUI from Faucet
If connected to testnet:
# Request test coins through CLI
sui client faucet
# Or visit web Faucet:
# https://faucet.testnet.sui.io
2.5 Install and Initialize EVE Vault
EVE Vault is your browser identity, used to connect to dApps and authorize transactions.
Installation Steps
1. Download the latest Chrome extension: https://github.com/evefrontier/evevault/releases/download/v0.0.6/eve-vault-chrome.zip
2. Unzip the file
3. Open Chrome → Extension Management → Enable "Developer mode" → "Load unpacked" → Select the unzipped folder
4. Click the extension icon and use your EVE Frontier SSO account (Google/Twitch, etc.) to create a Sui wallet through zkLogin
Advantage: zkLogin needs no mnemonic phrase; your Sui address is uniquely derived from your OAuth identity, which is both secure and convenient.
What’s most worth understanding here isn’t the “installation method,” but why it greatly lowers the barrier for new users:
- No need to first educate users about saving mnemonic phrases
- No need to first instill the traditional wallet mindset
- Users can enter on-chain interactions directly, using account systems they already know
For Builders, this means your dApp doesn’t have to assume users are “already experienced crypto users.” This will directly affect your product design approach:
- Login and connection flow can be shorter
- Gas experience can be further optimized with sponsored transactions
- You can focus on facility experience rather than wallet education
2.6 What Does EVE Vault Specifically Handle in This Book?
After installing EVE Vault, it will undertake three types of responsibilities in subsequent chapters:
- Wallet: Holds LUX, SUI, NFTs, and permission credentials
- Identity: Uses your EVE Frontier account to enter the on-chain interaction system
- Authorization Entry: Provides dApps with connection, signing, and sponsored-transaction capabilities
You can understand it this way: sui client is your on-chain identity on the command line, and EVE Vault is your on-chain identity in the browser and dApps.
The two aren't necessarily the same address, but both must work properly.
When Should You Check CLI First, When Should You Check Wallet First?
This can help you locate problems faster:
- Contract compilation fails: Check the CLI first
- Publishing transaction fails: Check the CLI's current network and address first
- Front-end can't connect the wallet: Check EVE Vault first
- Front-end connects but buttons report permission errors: Verify the wallet address, character attribution, and object permissions first
Don’t attribute all on-chain problems to “wallet broken” or “CLI misconfigured.” Most of the time, you haven’t first distinguished which layer the problem occurs in.
2.7 EVE Vault Faucet: Get Test Assets
During development and testing phases, you’ll encounter at least two types of test assets:
- Test SUI: Used as Gas for on-chain transactions
- Test LUX: Used to simulate EVE Frontier's in-game economic interactions
How to get LUX:
- After installing EVE Vault, find GAS Faucet in the extension interface
- Enter your Sui address to request test tokens
- LUX will appear in your EVE Vault balance
Detailed instructions: GAS Faucet Documentation
Why Do You Need Both SUI and LUX During Testing?
Because they play different roles:
- SUI: The Gas resource for on-chain transactions; without it, many transactions can't even be sent
- LUX: More like an economic asset in the EVE Frontier business environment; many tutorials and cases use it to simulate in-game fees, settlements, and permit purchases
If you only have SUI, no LUX:
- You can send transactions
- But many business processes can’t be practiced according to the book
If you only have LUX, no SUI:
- You’ll find it hard to complete even the most basic on-chain interactions
2.8 Minimum Acceptance Checklist
At this point you don't need to run the scaffolding or install front-end dependencies yet. First confirm these four things:
- sui --version outputs a version number
- sui client active-address returns the current address
- EVE Vault has completed zkLogin initialization
- You can see test assets in the wallet, or can at least request them from a Faucet
If all four are true, you have the minimum environment needed for the first half of this book.
Three Most Common Environment Misalignments
1. CLI on testnet, wallet switched to a different network
Manifestation:
- Can query objects in terminal
- Can’t see corresponding assets or components in front-end
2. CLI address and wallet address aren’t the same, but you didn’t realize it
Manifestation:
- Contract published from one address
- dApp connected to another address
- Front-end operations prompt no permission
3. Faucet tokens received, but received to “another identity set”
Manifestation:
- You clearly claimed test coins
- But current wallet or CLI address balance is still 0
Once you encounter “I clearly did it, but the system says I didn’t” problems, don’t rush to doubt the tutorial. Verify these three things again first.
When to Install Other Tools?
- By Chapter 6: Install and use builder-scaffold
- By Chapter 7: Handle script and dApp dependencies
- By the specific case chapters: Supplement the front-end runtime environment as needed
🔖 Chapter Summary
| Step | Operation |
|---|---|
| Install Sui CLI | suiup install sui@testnet |
| Configure Sui Client | sui client select network and create address |
| Install EVE Vault | Chrome extension + zkLogin create on-chain identity |
| Get Test Assets | SUI Faucet + EVE Vault GAS Faucet |
| Verify Environment | CLI address, wallet address, network, balance all visible |
📚 Extended Reading
- Sui CLI Complete Documentation
- Sui Client Configuration Guide
- EVE Vault
- EVE Vault GAS Faucet Documentation
Chapter 3: Move Smart Contract Fundamentals
Objective: Master core concepts of the Move language, understand the Sui object model, and be able to read and modify EVE Frontier contract code.
Status: Foundation chapter. Main text focuses on Move language, object model, and minimal examples.
3.1 Move Language Overview
Move is the smart contract language used by Sui, designed specifically around the constraint that on-chain assets must not be arbitrarily copied, discarded, or transferred. Rather than starting from a general-purpose language and bolting on libraries to constrain assets, it treats "resources" as first-class citizens at the language level.
You can start with three intuitions:
- Assets aren't just balance numbers: on Sui, many assets are truly independent objects with their own id, fields, ownership, and lifecycle
- Types determine whether you can copy, store, or discard: Move uses an ability system to restrict what you can do with a value, preventing you from treating a "precious asset" like an ordinary variable
- Contracts are more like modules plus an object system: what you write isn't one giant global state, but a set of module functions that create, read, and modify objects
So learning Move isn’t just about learning syntax. What you really need to establish is a new way of thinking:
- First distinguish between ordinary data and resources
- Then distinguish who owns objects, who can modify them, who can transfer them
- Only then write these rules into function entries and business processes
This is also well-suited to EVE Frontier. Because in EVE, many things are naturally not “a row in a database record,” but more like independently existing assets or facilities:
- A pass NFT
- A smart stargate
- A storage unit
- A permission credential
- A kill record
When these things are in Move, the expression becomes very natural.
3.2 Module Structure
A Move contract consists of one or more modules:
// File: sources/my_contract.move
// Module declaration: package_name::module_name
module my_package::my_module {
// Import dependencies
use sui::object::{Self, UID};
use sui::tx_context::TxContext;
use sui::transfer;
// Struct definition (assets/data)
public struct MyObject has key, store {
id: UID,
value: u64,
}
// Init function (automatically executed once during contract deployment)
fun init(ctx: &mut TxContext) {
let obj = MyObject {
id: object::new(ctx),
value: 0,
};
transfer::share_object(obj);
}
// Public function (can be called externally)
public fun set_value(obj: &mut MyObject, new_value: u64) {
obj.value = new_value;
}
}
Although this code is short, it already contains the four most common types of Move elements:
- Module declaration: module my_package::my_module indicates "this file defines a module"
- Dependency imports: use introduces types or functions exposed by other modules
- Struct definition: MyObject describes what an on-chain object looks like
- Function entries: functions like init and set_value define how objects are created and modified
What’s the Relationship Between Modules and Packages?
Many newcomers conflate "package" and "module," but they're not at the same level:
- Package: An entire Move project directory, typically containing Move.toml, sources/, and tests/
- Module: A code unit within a package; a package can contain multiple modules
Here’s a structure closer to a real project:
my-extension/
├── Move.toml
├── sources/
│ ├── gate_logic.move
│ ├── gate_auth.move
│ └── pricing.move
└── tests/
└── gate_tests.move
Here:
- my-extension is a package
- gate_logic, gate_auth, and pricing are three modules
You can think of “package” as a deployment unit, and “module” as a code organization unit.
Why is init Important?
init executes once, when the package is first published. Common uses include:
- Creating shared objects
- Sending an AdminCap to the deployer
- Initializing global configuration
- Establishing registry objects
It's the "boot action when the system first comes online." If you don't create the key objects properly in init, many entry functions won't work later.
Why Do Fields Almost Always Start with id: UID?
Because on Sui, a true on-chain object must have UID, which represents a globally unique identity. Structs without UID are often just:
- Ordinary nested data
- Configuration items
- Event payloads
- One-time credentials
This is also your first clue when reading EVE contracts to determine “is this an independent object or not.”
3.3 Move’s Abilities (Ability System)
This is one of the most important concepts in Move. Each struct type can have the following abilities:
| Ability | Keyword | Meaning |
|---|---|---|
| Key | has key | Can be a Sui object, stored in global state |
| Store | has store | Can be nested and stored in other objects |
| Copy | has copy | Can be implicitly copied (use cautiously!) |
| Drop | has drop | Can be automatically discarded when function ends (not using it is okay) |
Don’t treat abilities as “syntax decoration.” They’re essentially answering a very serious question:
How is this value allowed to be handled by developers?
What Do the Four Abilities Mean?
1. key
has key indicates this type can exist as a top-level on-chain object.
Common characteristics:
- Usually contains id: UID
- Can be owned by an address, shared, or owned by an object
- Can be a core object read or written by transactions
Without key, this type cannot independently hang in on-chain global state.
2. store
has store indicates this type can be safely placed in other object fields.
For example:
- Putting a configuration struct into a StorageUnit
- Putting a whitelist rule into a SmartAssembly
- Embedding a metadata struct into an NFT
Often, a type isn’t an independent object, but it must be able to exist as a component of other objects—this is when you need store.
3. copy
has copy indicates this value can be copied.
This is usually only suitable for:
- Small, pure data
- Values that don't represent scarce resources
- Things like IDs, boolean flags, enums, and simple configuration values
If something represents “permission,” “asset,” “unique credential,” it usually shouldn’t be given copy.
4. drop
has drop indicates this value can be directly discarded if not used.
This ability seems inconspicuous, but is actually quite critical. Because Move is very strict by default: if a value isn’t properly consumed, the compiler will chase you asking “what exactly do you plan to do with it?”
So:
- With drop, leaving a value unused is fine
- Without drop, you must explicitly consume or transfer it
Why Do Abilities Directly Affect Security?
Because many security boundaries are not guarded by if judgments, but by “the type simply doesn’t allow you to do this.”
For example:
- If an NFT doesn't have copy, you can't duplicate it
- If a hot potato object doesn't have drop, you can't silently ignore it
- If a permission object has no public construction path, outsiders can't forge it
This is one of Move’s strong points: it moves many business constraints forward into the type system.
Application in EVE Frontier
// JumpPermit: has key + store, is a real on-chain asset, cannot be copied
public struct JumpPermit has key, store {
id: UID,
character_id: ID,
route_hash: vector<u8>,
expires_at_timestamp_ms: u64,
}
// VendingAuth: only has drop, is a one-time "credential" (Witness Pattern)
public struct VendingAuth has drop {}
These two examples are worth viewing together:
- JumpPermit is a real object that should exist on-chain, so it has key
- VendingAuth is only a witness value within a call flow; it needs no on-chain persistence, so it's given only drop
When reading EVE contracts, you can often guess the author's intent directly from the abilities:
- has key, store: likely a real object or quasi-object
- Only drop: likely a witness, receipt, or one-time intermediate state
- copy, drop, store: likely an ordinary value type or configuration data
3.4 Sui Object Model Explained
On Sui, all structs with key ability are objects, divided into three ownership types:
Ownership Types
1. Address-owned
└── Only the person holding that address can access
└── Example: Player character's OwnerCap
2. Shared Object
└── Anyone can read/write on-chain (controlled by contract logic)
└── Example: Smart storage unit, stargate body
3. Object-owned
└── Held by another object, external cannot directly access
└── Example: Configuration stored inside components
These three ownership types aren’t abstract classifications, but one of the most core decisions when designing business models.
1. Address-owned: Most Like “My Assets”
Address-owned objects are typically suitable for:
- Players' personal NFTs
- OwnerCap objects
- Characters' private credentials
- Transferable tickets, permits, and badges
Their characteristics:
- Controlled by a single address
- Transactions usually require that address's signature
- Well suited to expressing "whoever owns it controls it"
2. Shared Objects: Most Like “Public Facilities”
Shared objects are suitable for:
- Markets
- Stargates
- Storage units
- Alliance vaults
- Server-wide registries
The focus isn’t “who owns this object,” but “who can perform what operations on it under what rules.”
This is the core form of many EVE Frontier facility contracts. Because although a facility also has an operator, it’s first a public object that will be interacted with by many players together.
3. Object-owned: Most Like “Facility Internal Components”
Object ownership is commonly used to hide complex internal state, such as:
- Configuration object inside a facility
- Inventory table inside a component
- Auxiliary index inside a registry
Its benefit is encapsulating state, not letting external parties randomly take it out and misuse it.
Why is the Object Model Easier to Express Game Worlds Than “Global Mapping”?
Because many entities in games are naturally independently existing, can be referenced, can be transferred, can be composed:
- A turret
- A permit
- A character permission
- An alliance treaty
If all stuffed into one big table, logic becomes more and more like “database management scripts.” The object model is closer to real-world “entities + relationships + ownership.”
Objects Don’t Just Have Two States of “Exist or Not”
When designing, you also need to consider the object lifecycle:
- Creation: Who creates it? In init, or in a business entry function?
- Holding: Who owns it after creation? An address, shared, or inside another object?
- Modification: Who can get &mut? Under what preconditions is modification allowed?
- Transfer: Can it be transferred? Do permissions follow it after transfer?
- Destruction: When may it disappear? Must balances be settled or resources reclaimed before destruction?
Deterministic Derivation of Object IDs
In EVE Frontier, each in-game entity’s ObjectID on-chain is deterministically derived through TenantItemId:
public struct TenantItemId has copy, drop, store {
item_id: u64, // Unique ID in-game
tenant: String, // Distinguish different game server instances
}
This means after the game server knows item_id, it can pre-calculate that item’s ObjectID on-chain without waiting for on-chain response.
This is very important in the EVE scenario, because off-chain servers and on-chain objects need long-term alignment:
- Game server knows a facility’s, character’s, item’s business ID
- Contract needs to map it to on-chain object keys using stable rules
- Front-end and indexing services query according to the same rules
If this mapping isn’t stable, the entire system will be chaotic:
- Off-chain considers it the same facility
- On-chain found another object
- Data displayed by front-end and real interactive objects don’t match
So when you later see TenantItemId, derived_object, and registries, first realize: what's being solved is not "how to write code," but "how to keep identity consistent across systems."
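The property this derivation buys can be sketched in TypeScript. This is not the real algorithm (the actual ObjectID derivation goes through BCS serialization of TenantItemId and Sui's derived-object machinery); `deriveStableKey` is a hypothetical stand-in that only demonstrates why determinism keeps the off-chain and on-chain views aligned:

```typescript
import { createHash } from "node:crypto";

// Hypothetical sketch: the real Sui derivation hashes a BCS-serialized
// TenantItemId together with the parent object's address. What matters
// here is only the *property*: the same (tenant, item_id) pair always
// yields the same key, so the game server, contract, and indexer can all
// compute it independently and agree.
function deriveStableKey(tenant: string, itemId: bigint): string {
  const payload = `${tenant}::${itemId.toString()}`;
  return "0x" + createHash("sha256").update(payload).digest("hex");
}

// The game server can compute this before anything lands on-chain,
// and the front-end / indexer recompute the same key later.
const a = deriveStableKey("utopia", 42n);
const b = deriveStableKey("utopia", 42n);
const c = deriveStableKey("testnet", 42n); // different tenant, different key
```

If the mapping were not deterministic, each party would have to round-trip through the chain to learn which object corresponds to which in-game entity.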
3.5 Key Security Patterns
EVE Frontier and other Sui projects widely use several Move-specific security design patterns:
Pattern 1: Capability Pattern
Permissions are represented by holding objects, not account roles.
// Define capability object
public struct OwnerCap<phantom T> has key, store {
id: UID,
}
// Function that requires OwnerCap to call
public fun withdraw_by_owner<T: key>(
storage_unit: &mut StorageUnit,
owner_cap: &OwnerCap<T>, // Must hold this credential
ctx: &mut TxContext,
): Item {
// ...
}
Advantage: OwnerCap can be transferred, can be delegated, more flexible than account-level permissions.
You can think of Capability as “permission materialization”:
- Traditional thinking is often "check whether `sender == admin`"
- Move/Sui's more common thinking is "do you hold a certain permission object"
This brings several direct benefits:
- Permissions can be transferred
- Permissions can be split
- Permissions can be made into NFT / Badge / Cap
- Permission relationships are easier to audit on-chain
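To make the contrast concrete, here is a minimal language-agnostic sketch in TypeScript. The names (`OwnerCap`, `withdrawByCap`) are illustrative only, not the real contract types:

```typescript
// Style A: role check — permission is a property of an address,
// recorded in some global state.
const admins = new Set<string>(["0xalice"]);
function withdrawByRole(sender: string): boolean {
  return admins.has(sender); // "is the sender the admin?"
}

// Style B: capability — permission is an object you hold and can hand over.
interface OwnerCap { readonly vaultId: string }
function withdrawByCap(cap: OwnerCap, vaultId: string): boolean {
  return cap.vaultId === vaultId; // "do you hold the cap for this vault?"
}

// The cap can be transferred like any other value — delegation is just
// handing the object over, with no mutation of global state.
const cap: OwnerCap = { vaultId: "vault-1" };
const delegated = cap;
```

Style B is why capability-based permissions transfer, split, and audit more naturally: the permission itself is a first-class value rather than an entry in someone's table.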
Pattern 2: Typed Witness Pattern
This is the core of EVE Frontier’s extension system! Used to verify the caller is a specific package’s module.
// Builder defines a Witness type in their own package
module my_extension::custom_gate {
// Only this module can create Auth instances (because it has no public constructor)
public struct Auth has drop {}
// When calling stargate API, pass Auth {} as credential
public fun request_jump(
gate: &mut Gate,
destination: &Gate,
character: &Character,
expires_at: u64,
ctx: &mut TxContext,
) {
// Custom logic (e.g., check fees)
// ...
// Use Auth {} to prove call comes from this authorized module
gate::issue_jump_permit(
gate, destination, character,
Auth {}, // Witness: prove I am my_extension::custom_gate
expires_at,
ctx,
)
}
}
The Star Gate component knows your Auth type has been registered in the whitelist, so it allows the call.
This pattern feels strange the first time because Auth {} has no data inside. But what it really wants to express is:
“I’m not proving identity through field content, I’m proving which module I’m from through the type itself.”
Why is this strong?
- External modules can’t arbitrarily forge your witness type
- Components can only trust witness types in the whitelist
- So “who can call a certain underlying capability” can be restricted to specific extension packages
This is the core of EVE Frontier’s extensible components. Many components don’t simply expose a public entry for anyone to call, but require you to bring a specific witness to enter.
Pattern 3: Hot Potato
An object with none of the copy, store, or drop abilities must be consumed within the same transaction:
// No abilities = hot potato, must be handled in this tx
public struct NetworkCheckReceipt {}
public fun check_network(node: &NetworkNode): NetworkCheckReceipt {
// Perform check...
NetworkCheckReceipt {} // Return hot potato
}
public fun complete_action(
assembly: &mut Assembly,
receipt: NetworkCheckReceipt, // Must pass in, ensures check was executed
) {
let NetworkCheckReceipt {} = receipt; // Consume hot potato
// Formally execute operation
}
Purpose: Force certain operations to be atomically combined (e.g., “first check network node → then execute component operation”).
This pattern is especially suitable for “prerequisite checks cannot be skipped” processes:
- First verify eligibility, then mint credential
- First check network status, then execute facility action
- First read and lock a certain context, then settle
Its focus isn’t storing data, but using the type system to force callers to do things in order.
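A runtime sketch of the same discipline in TypeScript. Move enforces "must be consumed" at compile time through the ability system; TypeScript cannot, so this hypothetical model enforces single, mandatory consumption at runtime, which is enough to show the flow:

```typescript
// Names are illustrative, mirroring the Move example above.
class NetworkCheckReceipt {
  private consumed = false;
  consume(): void {
    if (this.consumed) throw new Error("receipt already consumed");
    this.consumed = true;
  }
}

function checkNetwork(online: boolean): NetworkCheckReceipt {
  if (!online) throw new Error("network node offline");
  return new NetworkCheckReceipt(); // the "hot potato"
}

function completeAction(receipt: NetworkCheckReceipt): string {
  receipt.consume(); // proves the check actually ran in this flow
  return "action executed";
}

// Correct order: check first, then act — completeAction cannot be
// reached without a receipt, and a receipt only comes from the check.
const result = completeAction(checkNetwork(true));
```

The key difference: in Move, "forgetting" the receipt is a compile error, so the check can never be skipped; here it would merely be a lost object.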
3.6 Function Visibility and Access Control
module example::access_demo {
// Private function: can only be called within this module
fun internal_logic() { }
// Package-visible: other modules in the same package can call (Layer 1 Primitives use this)
public(package) fun package_only() { }
// Entry: can be directly called as top-level of a Transaction
public fun user_action(ctx: &mut TxContext) { }
// Public: any module can call
public fun read_data(): u64 { 42 }
}
3.7 Writing Your First Move Extension Module
Let’s combine the above concepts to write the simplest possible Storage Unit extension:
module my_extension::simple_vault;
use world::storage_unit::{Self, StorageUnit};
use world::character::Character;
use world::inventory::Item;
use sui::tx_context::TxContext;
// Our Witness type
public struct VaultAuth has drop {}
// Assumed member badge type for this example; in practice it might be
// an NFT defined elsewhere in your package
public struct MemberBadge has key, store { id: UID }
/// Anyone can deposit items (open deposit)
public fun deposit_item(
storage_unit: &mut StorageUnit,
character: &Character,
item: Item,
ctx: &mut TxContext,
) {
// Use VaultAuth{} as witness, proving this call is a legally bound extension
storage_unit::deposit_item(
storage_unit,
character,
item,
VaultAuth {},
ctx,
)
}
/// Only characters with specific Badge (NFT) can withdraw items
public fun withdraw_item_with_badge(
storage_unit: &mut StorageUnit,
character: &Character,
_badge: &MemberBadge, // Must hold member badge to call
type_id: u64,
ctx: &mut TxContext,
): Item {
storage_unit::withdraw_item(
storage_unit,
character,
VaultAuth {},
type_id,
ctx,
)
}
3.8 Compilation and Testing
# In your Move package directory
cd my-extension
# Compile (checks types and logic)
sui move build
# Run unit tests
sui move test
# Publish to testnet
sui client publish
After successful publication, you’ll get a Package ID (like 0xabcdef...), which is your contract’s on-chain address.
🔖 Chapter Summary
| Concept | Key Points |
|---|---|
| Move Module | module package::name { } is code organization unit |
| Abilities | key (object), store (nestable), copy (copyable), drop (discardable) |
| Three Ownerships | Address-owned / Shared object / Object-owned |
| Capability Pattern | Permission = holding object, can transfer can delegate |
| Witness Pattern | Uniquely instantiated type as call credential, EVE Frontier extension core |
| Hot Potato | No-ability struct, force atomic operations |
📚 Extended Reading
- Move Book (Official)
- Sui Move Concepts
- Typed Witness Pattern
- Capability Pattern
- EVE Frontier World Contract Code
Chapter 4: Smart Assembly Development and On-Chain Deployment
Objective: Understand the working principles and APIs of each smart assembly, master the complete workflow from character creation to contract deployment.
Status: Foundation chapter. Main text focuses on deployment workflow and on-chain component operations.
4.1 Complete Deployment Workflow
Before your code can take effect in the real game, you need to complete the following complete chain:
1. Create on-chain character (Smart Character)
↓
2. Deploy network node (Network Node), deposit fuel and go online
↓
3. Anchor smart assembly (Anchor Assembly)
↓
4. Bring assembly online (Assembly Online)
↓
5. Write and publish custom Move extension package
↓
6. Register extension to assembly (authorize_extension)
↓
7. Players interact with assembly through extension API
In local development, steps 1-5 can be completed with one click using builder-scaffold initialization scripts.
Many people, on first encountering this chapter, mistakenly assume that "publishing contracts" is the main process. It isn't. For EVE Builders, the real main process is:
- First have on-chain entity
- Then have operable facilities
- Then have custom extension logic
- Finally attach extensions to facilities for player consumption
In other words, the Move package you write doesn’t work independently out of thin air. It must be attached to a smart assembly that actually exists, is already online, and is already attributed to the character system.
The Three Most Easily Confused “IDs” in Deployment
During deployment, at least three types of IDs will appear simultaneously:
- Package ID Represents your Move package published on-chain
- Object ID Represents specific objects, such as characters, stargates, turrets, storage units
- Business ID Represents character, item, facility numbers in the game server
Don’t mix these three:
- Package ID determines "where your code is"
- Object ID determines "where your facilities and assets are"
- Business ID determines "who things in the game world are"
Later you’ll frequently switch back and forth between “code address” and “facility object address.” If these two concepts aren’t separated, debugging will be very painful.
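One low-tech way to keep the three IDs apart in your own scripts is branded types. This is a generic TypeScript technique, not part of any EVE SDK, and the field names below are invented for illustration:

```typescript
// Branded string types: at runtime these are plain strings, but the
// compiler refuses to let one kind be used where another is expected.
type PackageId = string & { readonly __brand: "PackageId" };
type ObjectId = string & { readonly __brand: "ObjectId" };
type BusinessId = string & { readonly __brand: "BusinessId" };

const asPackageId = (s: string) => s as PackageId;
const asObjectId = (s: string) => s as ObjectId;
const asBusinessId = (s: string) => s as BusinessId;

// A deployment config now documents which kind of ID each field holds.
interface DeploymentConfig {
  worldPackage: PackageId; // "where the code is"
  gateObject: ObjectId;    // "where the facility is"
  gameGateId: BusinessId;  // "what the game server calls it"
}

const config: DeploymentConfig = {
  worldPackage: asPackageId("0xabc..."),
  gateObject: asObjectId("0x123..."),
  gameGateId: asBusinessId("gate-7001"),
};
// config.worldPackage = config.gateObject; // would be a compile error
```

Passing a Package ID where an Object ID belongs then fails at compile time instead of surfacing as a confusing on-chain error.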
4.2 Smart Character
Smart Character is your main identity on-chain, all assemblies belong to your character.
Character’s On-Chain Structure
public struct Character has key {
id: UID, // Unique object ID
// Each owned asset corresponds to an OwnerCap
// owner_caps stored as dynamic fields
}
OwnerCap: Asset Ownership Credential
Whenever you own an assembly (network node/turret/stargate/storage unit), the character will hold the corresponding OwnerCap<T> object. All write operations to that assembly require first “borrowing” this OwnerCap from the character:
// TypeScript script example: Borrow OwnerCap
const [ownerCap] = tx.moveCall({
target: `${packageId}::character::borrow_owner_cap`,
typeArguments: [`${packageId}::assembly::Assembly`],
arguments: [tx.object(characterId), tx.object(ownerCapId)],
});
// ... use ownerCap to execute operations ...
// Must return after use
tx.moveCall({
target: `${packageId}::character::return_owner_cap`,
typeArguments: [`${packageId}::assembly::Assembly`],
arguments: [tx.object(characterId), ownerCap],
});
💡 Borrow & Return pattern combined with Hot Potato ensures OwnerCap won’t leave the character object.
Why Not Take OwnerCap Out and Hold Permanently?
Because OwnerCap isn’t an ordinary key, but a high-privilege credential. Designing it as “borrow then must return” has several direct benefits:
- Permissions won’t easily leave the character system
- After a transaction ends, won’t leave dangling high-privilege objects
- Assembly ownership still stably belongs to character, rather than scattered to script addresses or temporary objects
From a design perspective, this is equivalent to implementing “temporary privilege escalation” on-chain:
- You first prove you’re a legitimate operator of the character
- System temporarily lends you the permission object
- After completing high-privilege operations, you must return the permission
This is more flexible than a hardcoded admin address, and better suited to game needs such as delegation, transfer, inheritance, and shell-swapping.
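The borrow-then-return discipline can be wrapped in a helper so scripts cannot forget the return call. This sketch deliberately avoids the real Sui SDK and models the transaction as a plain ordered list of calls; `withOwnerCap` is a hypothetical helper name, not part of any official scaffold:

```typescript
// The "transaction" here is just an ordered list of move-call targets.
type Call = { target: string };

// Guarantees the return call is appended even if the body forgets it,
// mirroring what the hot-potato design enforces on-chain.
function withOwnerCap(
  calls: Call[],
  body: (calls: Call[]) => void,
): Call[] {
  calls.push({ target: "character::borrow_owner_cap" });
  body(calls); // caller adds the privileged operations here
  calls.push({ target: "character::return_owner_cap" });
  return calls;
}

const tx = withOwnerCap([], (calls) => {
  calls.push({ target: "gate::authorize_extension" });
});
// tx is now: borrow -> authorize -> return, in that order
```

In a real script the same shape applies to `Transaction.moveCall` from `@mysten/sui/transactions`: the wrapper takes the transaction object, and the borrow/return pair brackets whatever the body adds.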
What Role Does Character Actually Play in Business?
Don’t just understand Character as an alias for wallet address. It’s more like an on-chain “operating entity”:
- Assemblies hang under character name, not directly under wallet address name
- A character can manage multiple `OwnerCap`s internally and uniformly
- A character can serve as a bridge between on-chain permissions and in-game identity
So in many Builder scenarios, the truly stable entity isn’t “which wallet clicked the button,” but “which character is operating these facilities.”
4.3 Network Node
What is a Network Node?
- Energy station anchored at Lagrange Points
- Provides Energy for all nearby smart assemblies
- Each assembly when going online needs to “reserve” a certain amount of energy from the network node
Lifecycle
Anchored (Anchored)
↓ depositFuel (Deposit fuel)
Fueled (Fueled)
↓ online (Go online)
Online (Running) ←→ offline (Go offline)
What’s most important here isn’t remembering state names, but understanding:
Whether a facility can work depends not only on “whether the contract is published,” but also on whether it’s actually powered in the game world.
This is a key difference between EVE Frontier and ordinary dApps. In ordinary dApps, after a contract is successfully published, theoretically anyone can call it; but in EVE, many facilities’ availability is also constrained by “world state”:
- Is there a network node
- Does the network node have fuel
- Is the facility properly anchored
- Is the facility online
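The constraint list above collapses into a single availability check. The following is a sketch under assumed field names (none of these come from the real API); it only captures the idea that "contract published" is one condition among several:

```typescript
// Lifecycle states from the diagram above.
type NodeState = "Anchored" | "Fueled" | "Online";

interface WorldState {
  hasNetworkNode: boolean;
  nodeState: NodeState;
  assemblyAnchored: boolean;
  assemblyOnline: boolean;
}

// A facility is usable only when *every* world-state condition holds.
function facilityAvailable(w: WorldState): boolean {
  return (
    w.hasNetworkNode &&
    w.nodeState === "Online" && // node must be fueled *and* switched on
    w.assemblyAnchored &&
    w.assemblyOnline
  );
}

const ready: WorldState = {
  hasNetworkNode: true, nodeState: "Online",
  assemblyAnchored: true, assemblyOnline: true,
};
const unfueled: WorldState = { ...ready, nodeState: "Anchored" };
```

The useful debugging habit: when a call unexpectedly fails, walk this predicate top to bottom before suspecting your contract logic.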
From Builder’s Perspective, What Does Network Node Actually Solve?
It solves the problem of “facilities shouldn’t be online unconditionally forever.”
If this layer of design didn’t exist:
- Stargates could be open forever
- Turrets could work forever
- Storage facilities could keep responding
Then much of the game's meaning around operation, maintenance, supply, and occupation would be lost. With network nodes added, facilities become assets that genuinely need maintenance, rather than "deploy once, print money forever" machines.
Initialization Scripts for Local Testing (from builder-scaffold)
# Execute in builder-scaffold/ts-scripts directory
pnpm setup:character # Create character
pnpm setup:network-node # Create and start network node
pnpm setup:assembly # Create and connect smart assembly
4.4 Smart Storage Unit (SSU) In-Depth Analysis
Two Types of Inventory
| Inventory Type | Holder | Capacity | Access Method |
|---|---|---|---|
| Primary Inventory | Assembly Owner | Large | OwnerCap<StorageUnit> |
| Ephemeral Inventory | Interacting character | Small | Character’s own OwnerCap |
Ephemeral inventory is used by non-Owner players to interact with your SSU (e.g., during a purchase, items are first transferred into the ephemeral inventory, then the player collects them).
How Do Items Reach the Chain?
In-game item → game_item_to_chain_inventory() → On-chain Item object
On-chain Item object → chain_item_to_game_inventory() → In-game item (requires proximity proof)
What’s truly difficult here isn’t “which function to call,” but understanding that inventories on both sides aren’t simple mirrors.
Many newcomers will default to thinking:
- There’s a gun in the game backpack
- After going on-chain it’s just “copying a record”
Actually the correct understanding is closer to:
- A certain in-game item is mapped to an on-chain object through a trusted process
- This object then enters the on-chain inventory system
- When it’s taken back to the game world, it needs to go through another trusted return path
So the essence of the Storage Unit isn't an "on-chain cabinet," but an asset-exchange node between the chain and the game world.
Why Distinguish Between Primary and Ephemeral Inventory?
Because many interactions aren’t “Owner opening the warehouse to get things themselves,” but “third-party players having a controlled interaction with your facility.”
For example, a vending machine:
- Player pays tokens
- Facility first transfers corresponding items to a temporary intermediate area
- Player then claims from that path
Benefits of doing this:
- Don’t have to fully expose main inventory to external parties
- Intermediate states of transactions are easier to audit
- Easier to do rollback and settlement when failures occur
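A toy model of the vending-machine flow, with made-up names throughout; the real SSU API moves on-chain Item objects rather than strings, but the staging discipline is the same:

```typescript
interface Ssu {
  primary: string[];   // owner's main inventory
  ephemeral: string[]; // per-interaction staging area
}

// The owner (or the extension acting for them) stages one item for sale;
// the main inventory is never exposed to the buyer directly.
function stageForSale(ssu: Ssu, item: string): void {
  const i = ssu.primary.indexOf(item);
  if (i < 0) throw new Error("item not in primary inventory");
  ssu.primary.splice(i, 1);
  ssu.ephemeral.push(item);
}

// The buyer can only claim what was explicitly staged for them.
function claim(ssu: Ssu, item: string): string {
  const i = ssu.ephemeral.indexOf(item);
  if (i < 0) throw new Error("item not staged");
  ssu.ephemeral.splice(i, 1);
  return item;
}

const ssu: Ssu = { primary: ["ammo-crate"], ephemeral: [] };
stageForSale(ssu, "ammo-crate");
const bought = claim(ssu, "ammo-crate");
```

Note how a failed payment would leave the item sitting in the staging area, where it can be audited and returned to the primary inventory, instead of in limbo.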
Extension API Overview
// 1. Register extension (Owner calls)
public fun authorize_extension<Auth: drop>(
storage_unit: &mut StorageUnit,
owner_cap: &OwnerCap<StorageUnit>,
)
// 2. Extension deposits item
public fun deposit_item<Auth: drop>(
storage_unit: &mut StorageUnit,
character: &Character,
item: Item,
_auth: Auth, // Witness
ctx: &mut TxContext,
)
// 3. Extension withdraws item
public fun withdraw_item<Auth: drop>(
storage_unit: &mut StorageUnit,
character: &Character,
_auth: Auth, // Witness
type_id: u64,
ctx: &mut TxContext,
): Item
4.5 Smart Gate In-Depth Analysis
Default vs Custom Behavior
No extension: Anyone can jump
↓ authorize_extension<MyAuth>()
Has extension: Players must hold JumpPermit to jump
JumpPermit Mechanism
// Jump permit: time-limited on-chain object
public struct JumpPermit has key, store {
id: UID,
character_id: ID,
route_hash: vector<u8>, // A↔B bidirectional valid
expires_at_timestamp_ms: u64,
}
The key to JumpPermit isn’t that “it’s a ticket,” but that it splits a complex judgment into two segments:
- First decide “are you qualified to get the ticket”
- Then decide “can you execute the jump with the ticket”
This split is very suitable for game rule extensions, because “qualification judgment” can be very complex:
- Are you a whitelist member
- Have you paid
- Have you completed prerequisite quests
- Are you within the valid time window
But once the ticket is issued, the logic when actually executing the jump can be more standard and unified.
This is also a common thinking for many extension designs:
Move complex business judgments forward to “credential issuance,” converge underlying facility actions to “credential consumption.”
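The issuance/consumption split might look like this in a language-agnostic sketch; the specific checks and the 60-second validity window are invented examples, not the real gate rules:

```typescript
interface JumpPermit {
  characterId: string;
  routeHash: string;   // A<->B bidirectional, as in the Move struct above
  expiresAtMs: number;
}

// Issuance: all the complex, extension-specific business judgment.
function issuePermit(
  characterId: string,
  routeHash: string,
  opts: { onWhitelist: boolean; paid: boolean; nowMs: number },
): JumpPermit {
  if (!opts.onWhitelist) throw new Error("not on whitelist");
  if (!opts.paid) throw new Error("toll not paid");
  return { characterId, routeHash, expiresAtMs: opts.nowMs + 60_000 };
}

// Consumption: a small, uniform check at jump time.
function consumePermit(p: JumpPermit, routeHash: string, nowMs: number): boolean {
  return p.routeHash === routeHash && nowMs < p.expiresAtMs;
}

const permit = issuePermit("char-1", "A<->B", {
  onWhitelist: true, paid: true, nowMs: 0,
});
```

Whatever new qualification rules a Builder invents, only `issuePermit` grows; the jump path stays identical for every extension.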
Complete jump process:
1. The player calls your extension function (e.g., `pay_and_request_permit()`)
2. The extension verifies conditions (checks tokens, checks the whitelist, etc.)
3. The extension calls `gate::issue_jump_permit()` to issue the Permit
4. The Permit is transferred to the player
5. The player calls `gate::jump_with_permit()` to jump; the Permit is consumed
Extension API
// Register extension
public fun authorize_extension<Auth: drop>(
gate: &mut Gate,
owner_cap: &OwnerCap<Gate>,
)
// Issue jump permit (only registered Auth types can call)
public fun issue_jump_permit<Auth: drop>(
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
_auth: Auth,
expires_at_timestamp_ms: u64,
ctx: &mut TxContext,
)
// Jump using permit (consumes JumpPermit)
public fun jump_with_permit(
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
jump_permit: JumpPermit,
admin_acl: &AdminACL,
clock: &Clock,
ctx: &mut TxContext,
)
What is authorize_extension Actually Authorizing?
It’s authorizing not “a certain address,” nor “a certain transaction,” but a certain type identity.
In other words, what assemblies truly trust is:
- Only calls bringing a certain designated witness type
- Can enter underlying capability entry points
This gives assembly extensions two important properties:
- Assembly kernel doesn’t need to know what your business logic looks like
- But it can very clearly know “which extension types are qualified to access”
So a Builder's work is usually not "modifying the official logic," but "packaging their own logic as a typed extension that the official components are configured to allow."
4.6 Smart Turret In-Depth Analysis
Turret’s extension pattern is similar to stargate, authorized through Typed Witness.
Default Behavior
Turret uses standard attack logic provided by game server.
Custom Behavior
Builder can register extensions to change turret’s target judgment logic. For example:
- Allow characters holding specific NFT to pass safely
- Only attack characters not on alliance list
- Turn attack on/off based on time period (open daytime, closed nighttime)
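One natural way to express such rules is as composable predicates over a target. This is a design sketch, not the turret API; the fields and rules are invented from the examples above:

```typescript
interface Target { hasSafePassNft: boolean; inAlliance: boolean }

// A rule returns true when the target should be spared.
type Rule = (t: Target, hourUtc: number) => boolean;

const holdsNft: Rule = (t) => t.hasSafePassNft;
const allied: Rule = (t) => t.inAlliance;
const nightCurfew: Rule = (_t, hour) => hour >= 22 || hour < 6; // turret off at night

// Attack only if no rule spares the target.
function shouldAttack(t: Target, hourUtc: number, rules: Rule[]): boolean {
  return !rules.some((rule) => rule(t, hourUtc));
}

const stranger: Target = { hasSafePassNft: false, inAlliance: false };
const friend: Target = { hasSafePassNft: false, inAlliance: true };
```

The appeal of this shape for a Move extension is the same as for the stargate: the turret core only asks one yes/no question, while the extension is free to make answering it arbitrarily elaborate.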
4.7 Publishing and Registering Extension to Assembly
Step 1: Publish Your Extension Package
# In your Move package directory
sui client publish
# Output example:
# Package ID: 0x1234abcd...
# Transaction Digest: HMNaf...
Record the Package ID; this is your contract's on-chain address.
After publishing completes, you should immediately record at least three types of information:
- Your Package ID
- The object ID of the assembly you want to bind
- The transaction digest
Because when troubleshooting problems later, almost all chains trace back from these three things:
- Was the contract successfully published
- Is the facility the object you thought it was
- Did this authorization or registration actually succeed on-chain
Step 2: Authorize Extension to Assembly
Through TypeScript script (or dApp call) to register your extension:
import { Transaction } from "@mysten/sui/transactions";
const tx = new Transaction();
// Borrow OwnerCap from character
const [ownerCap] = tx.moveCall({
target: `${WORLD_PACKAGE}::character::borrow_owner_cap`,
typeArguments: [`${WORLD_PACKAGE}::gate::Gate`],
arguments: [tx.object(CHARACTER_ID), tx.object(OWNER_CAP_ID)],
});
// Authorize extension (tell stargate: allow my_extension::custom_gate::Auth type to call)
tx.moveCall({
target: `${WORLD_PACKAGE}::gate::authorize_extension`,
typeArguments: [`${MY_PACKAGE}::custom_gate::Auth`], // Your Witness type
arguments: [tx.object(GATE_ID), ownerCap],
});
// Return OwnerCap
tx.moveCall({
target: `${WORLD_PACKAGE}::character::return_owner_cap`,
typeArguments: [`${WORLD_PACKAGE}::gate::Gate`],
arguments: [tx.object(CHARACTER_ID), ownerCap],
});
await client.signAndExecuteTransaction({ signer: keypair, transaction: tx });
Here you need to pay special attention to one thing:
“Publish successful” doesn’t equal “extension already in effect.”
You should at least confirm three layers of binding relationships are established:
- Your package is already on-chain
- Your assembly object is the correct assembly
- Assembly has added your witness type to allowed list
Step 3: Verify Registration Success
# Query stargate object, confirm extension type has been added to allowed_extensions
sui client object <GATE_ID>
4.8 Using TypeScript to Read On-Chain State
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" });
// Read stargate object
const gateObject = await client.getObject({
id: GATE_ID,
options: { showContent: true },
});
console.log(gateObject.data?.content);
// GraphQL query all assemblies of specified type
const query = `
query {
objects(filter: { type: "${WORLD_PACKAGE}::gate::Gate" }) {
nodes {
address
asMoveObject { contents { json } }
}
}
}
`;
Why Does Deployment Chapter Also Need to Discuss Reading State?
Because in actual development, deployment and reading have never been two separate things. After completing each step, you need to immediately verify:
- Was the object created
- Did state switch to online
- Was extension registered successfully
- Did assembly fields change as expected
So the real rhythm is usually:
Execute one step
-> Immediately read on-chain state
-> Confirm object and field changes
-> Continue to next step
If you only know how to “send transactions” but don’t know how to “immediately verify state,” it’s hard to judge whether it’s:
- Transaction didn’t send
- Sent but wrong object
- Object correct but state didn’t change
- State changed but front-end queried wrong place
From Developer’s Perspective, What’s the Minimum Closed Loop of This Chapter?
The minimum closed loop isn’t “I published a package,” but:
- Have character
- Have facility
- Have permission credential
- Have custom extension package
- Have successful registration record
- Have one real verifiable player interaction
Only when all 6 things are completed do you truly finish a Builder facility extension.
🔖 Chapter Summary
| Deployment Step | Key Operation |
|---|---|
| 1. Character | On-chain identity, holds all OwnerCap |
| 2. Network Node | Deposit fuel → Go online → Output energy |
| 3. Assembly | Anchor → Connect node → Go online |
| 4. Extension Package | sui client publish |
| 5. Register Extension | authorize_extension<MyAuth>(gate, owner_cap) |
| 6. Player Interaction | Call your Entry functions, call world contracts through Witness |
📚 Extended Reading
- Smart Storage Unit Documentation
- Smart Gate Documentation
- Interfacing with the World
- builder-scaffold ts-scripts
Chapter 5: dApp Front-End Development and Wallet Integration
Objective: Use `@evefrontier/dapp-kit` to build a front-end dApp that can connect to the EVE Vault wallet, read on-chain data, and execute transactions.
Status: Foundation chapter. Main text focuses on wallet integration, front-end state reading, and transaction initiation.
5.1 The Role of dApp in EVE Frontier
After completing Move contract development, players need an interface to interact with your facilities. The dApp (decentralized application) is that interface. It can:
- Display real-time status of your smart assemblies (inventory, online status, etc.)
- Let players connect EVE Vault wallet
- Trigger on-chain transactions through UI (purchase items, apply for jump permits, etc.)
- Run in standard web browsers without downloading game client
Two Usage Scenarios
| Scenario | Description |
|---|---|
| In-Game Overlay | When players approach assemblies in-game, game client displays your dApp (iframe) |
| External Browser | Independent webpage, connects wallet through EVE Vault extension |
Many people misunderstand dApps as “wrapping contracts with a front-end skin.” In EVE Frontier, its more accurate role is:
Turn on-chain facilities into service interfaces that players actually want to use.
Because for the same facility, if there’s only a contract without a dApp, players usually lack these key pieces of information:
- What is the current state
- Do I have permission to operate
- How much does the operation cost
- What exactly happened after clicking the button
So dApps aren’t just “presentation layer,” they also bear three very practical responsibilities:
- Explain state Translate object fields into business states players can understand
- Organize transactions Help users assemble complex parameters, object IDs, amounts into a legal transaction
- Handle feedback Tell users whether they’re currently waiting for signature, waiting for on-chain, success, failure, or need to retry
A dApp’s Minimum Working Loop
Regardless of whether you’re making a shop, stargate, or turret console, front-ends basically can’t avoid this loop:
Connect wallet
-> Read assembly and user state
-> Determine currently allowed actions
-> Build transaction
-> Request signature / Initiate sponsored transaction
-> Wait for result
-> Refresh objects and interface
As long as one link in this loop isn’t done well, user experience will break.
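The loop above can be written down as a single function with injected effects, which also makes each link explicit and testable. Real wallet and query APIs are asynchronous; they are kept synchronous here for brevity, and all the callbacks are stand-ins, not dapp-kit functions:

```typescript
interface DappIo {
  connect: () => string;                 // returns the wallet address
  readState: () => { online: boolean };  // reads the assembly object
  buildAndSign: () => string;            // returns the tx digest
}

function interactOnce(io: DappIo): string[] {
  const log: string[] = [];
  log.push(`connected:${io.connect()}`);
  if (!io.readState().online) {   // read state *before* acting
    log.push("blocked:offline");
    return log;
  }
  log.push(`executed:${io.buildAndSign()}`);
  io.readState();                 // refresh from chain afterwards,
  log.push("refreshed");          //   not from local button state
  return log;
}
```

Each log entry corresponds to one link in the loop; a UX bug usually means one of these entries was skipped or reordered.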
5.2 Installing dapp-kit
# Create React project (using Vite as example)
npx create-vite my-dapp --template react-ts
cd my-dapp
# Install EVE Frontier dApp SDK and dependencies
npm install @evefrontier/dapp-kit @tanstack/react-query react
SDK Core Features Overview
| Feature | Provides |
|---|---|
| 🔌 Wallet Connection | Integration with EVE Vault and standard Sui wallets |
| 📦 Smart Object Data | Get and transform assembly data through GraphQL |
| ⚡ Sponsored Transactions | Support gas-free transactions (backend pays) |
| 🔄 Auto Polling | Real-time refresh of on-chain data |
| 🎨 Full TypeScript Types | Complete type definitions for all components |
5.3 Project Basic Configuration
Configure Provider
All dApp functionality must be wrapped in EveFrontierProvider:
// src/main.tsx
import React from 'react'
import ReactDOM from 'react-dom/client'
import { QueryClient, QueryClientProvider } from '@tanstack/react-query'
import { EveFrontierProvider } from '@evefrontier/dapp-kit'
import App from './App'
// React Query client (manage cache)
const queryClient = new QueryClient({
defaultOptions: {
queries: {
staleTime: 5 * 1000, // Data considered fresh for 5 seconds before becoming eligible for refetch
}
}
})
ReactDOM.createRoot(document.getElementById('root')!).render(
<React.StrictMode>
{/* EVE Frontier SDK Provider */}
<EveFrontierProvider queryClient={queryClient}>
<App />
</EveFrontierProvider>
</React.StrictMode>
)
The role of Provider isn’t just “making Hooks work.” It actually helps you uniformly manage three types of contexts:
- Wallet connection context
- On-chain query and cache context
- dApp-kit’s own environment information
So it’s essentially the “operating platform” for the entire dApp. If this layer is misconfigured, many errors that seem like business problems later are actually context not initialized properly.
Bind Assembly Through URL Parameters
dApp knows which assembly to display through URL parameters:
# In-game access:
https://your-dapp.com/?tenant=utopia&itemId=0x1234abcd...
# tenant: Game server instance name (prod/testnet/dev)
# itemId: Assembly's ObjectID on-chain
SDK automatically reads these parameters from URL, you don’t need to handle manually.
The core idea here is:
The same front-end page doesn’t serve a fixed assembly, but dynamically binds current assembly context according to URL.
This has two direct benefits:
- You can reuse the same front-end to serve many facilities
- The in-game overlay only needs to pass in `tenant` and `itemId`, and the page knows "who I'm serving now"
Why Are tenant and itemId Both Essential?
- `itemId` solves "which object is it"
- `tenant` solves "which world instance it belongs to"
If only passing itemId, in multi-tenant or multi-environment scenarios, it’s easy to read data from wrong world; if only passing tenant, you don’t know which facility object it currently is.
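If you ever need to read these parameters yourself (for instance, outside dapp-kit), a defensive sketch might look like the following; the validation rules are assumptions for illustration, not SDK behavior:

```typescript
// Parse the assembly-binding parameters from a dApp URL.
function parseBinding(url: string): { tenant: string; itemId: string } {
  const params = new URL(url).searchParams;
  const tenant = params.get("tenant");
  const itemId = params.get("itemId");
  if (!tenant) {
    throw new Error("missing tenant: don't know which world instance");
  }
  if (!itemId || !itemId.startsWith("0x")) {
    throw new Error("missing or malformed itemId: don't know which object");
  }
  return { tenant, itemId };
}

const binding = parseBinding(
  "https://your-dapp.com/?tenant=utopia&itemId=0x1234",
);
```

Failing loudly on a missing `tenant` is deliberate: silently defaulting to one environment is exactly how a page ends up reading data from the wrong world.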
5.4 Core Hooks Explained
Hook 1: useConnection (Wallet Connection State)
import { useConnection } from '@evefrontier/dapp-kit'
function WalletButton() {
const {
isConnected, // boolean: whether wallet connected
currentAddress, // string | null: current wallet address
handleConnect, // () => void: trigger connection flow
handleDisconnect, // () => void: disconnect
} = useConnection()
if (!isConnected) {
return (
<button onClick={handleConnect} className="connect-btn">
Connect EVE Vault Wallet
</button>
)
}
return (
<div>
<span>Connected: {currentAddress?.slice(0, 8)}...</span>
<button onClick={handleDisconnect}>Disconnect</button>
</div>
)
}
What useConnection solves isn't simply "being able to pop up the wallet," but the first layer of state branching for the entire page:
- When wallet not connected, page can only show public information
- When wallet connected but no character, page might need to prompt to initialize identity first
- Only when wallet connected and has character, page can enter real interactive state
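One way to make this branching explicit is to derive a single UI mode from the connection state. Here `hasCharacter` is an assumed extra flag your app would determine separately; the hook itself does not provide it:

```typescript
// The three page modes described above, as a discriminated value
// components can switch on, instead of scattered if-checks.
type UiMode = "public" | "needs-character" | "interactive";

function uiMode(isConnected: boolean, hasCharacter: boolean): UiMode {
  if (!isConnected) return "public";          // only public info
  if (!hasCharacter) return "needs-character"; // prompt identity setup
  return "interactive";                        // full interaction allowed
}
```

Centralizing the decision means every component agrees on which mode the page is in, rather than each re-deriving it from raw flags.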
Hook 2: useSmartObject (Current Assembly Data)
import { useSmartObject } from '@evefrontier/dapp-kit'
function AssemblyStatus() {
const {
assembly, // Current assembly's complete data (inventory, state, name, etc.)
loading, // Whether loading
error, // Error message
refetch, // Manual refresh
} = useSmartObject()
if (loading) return <div className="spinner">Reading on-chain data...</div>
if (error) return <div className="error">Error: {error.message}</div>
return (
<div className="assembly-card">
<h2>{assembly?.name}</h2>
<p>Status: {assembly?.status}</p>
<p>Owner: {assembly?.owner}</p>
</div>
)
}
What’s most important here isn’t the Hook name, but developing a habit:
Pages should always center on “on-chain object state,” not on local button state.
In other words, after users click buttons, don’t just set status = success locally on front-end. A more stable approach is:
- Wait for transaction to return
- Re-read object
- Refresh UI with on-chain true state
Otherwise you’ll easily get:
- Front-end thinks it succeeded
- But on-chain object didn’t change
- Page still shows “operation completed”
Hook 3: useNotification (User Notifications)
import { useNotification } from '@evefrontier/dapp-kit'
function ActionButton() {
const { showNotification } = useNotification()
const handleAction = async () => {
try {
// ... execute transaction ...
showNotification({ type: 'success', message: 'Transaction successful!' })
} catch (e) {
showNotification({ type: 'error', message: 'Transaction failed: ' + (e as Error).message })
}
}
return <button onClick={handleAction}>Execute Operation</button>
}
The true value of notification systems isn’t “making a popup,” but breaking on-chain asynchronous processes into stages users can understand:
- Connecting wallet
- Waiting for signature
- On-chain processing
- Confirmed
- Failed, need retry
If you only give users “success / failure,” many complex transactions will seem like black boxes.
5.5 Executing On-Chain Transactions
Standard Transaction (User Pays Gas)
Use useDAppKit from @mysten/dapp-kit-react to execute:
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
function BuyItemButton({ storageUnitId, typeId }: Props) {
const dAppKit = useDAppKit()
const handleBuy = async () => {
// Build transaction
const tx = new Transaction()
tx.moveCall({
// Call your published extension contract function
target: `${MY_PACKAGE_ID}::vending_machine::buy_item`,
arguments: [
tx.object(storageUnitId),
tx.object(CHARACTER_ID),
tx.splitCoins(tx.gas, [tx.pure.u64(100)]), // Pay 100 SUI
tx.pure.u64(typeId),
],
})
// Sign and execute
try {
const result = await dAppKit.signAndExecuteTransaction({
transaction: tx,
})
console.log('Transaction successful!', result.digest)
} catch (e) {
console.error('Transaction failed', e)
}
}
return <button onClick={handleBuy}>Buy Item</button>
}
What Stages Does a Front-End Transaction Typically Split Into?
From the front end's perspective, a transaction splits into at least five stages:
1. Prepare parameters: are the assembly ID, character ID, amount, and type parameters complete?
2. Build the transaction: assemble objects, pure values, coin splits, and function calls into a `Transaction`
3. Request a signature: have the wallet or sponsorship service confirm the transaction
4. Submit for execution: the transaction actually enters on-chain execution
5. Write back to the interface: refresh the UI based on the digest and the latest object state
Many front-end bugs occur not at "transaction failed" but at steps 1 and 5:
- A parameter received the wrong object
- The page is using a stale local cache
- The transaction succeeded but the page didn't refresh
- The digest is available, but the object query hasn't updated yet
Sponsored Transaction (Sponsored Tx, Gas-Free)
When an operation needs server-side verification, or the platform pays the gas:
import { signAndExecuteSponsoredTransaction } from '@evefrontier/dapp-kit'
const result = await signAndExecuteSponsoredTransaction({
transaction: tx,
// SDK automatically handles sponsorship logic, communicates with EVE Frontier backend
})
Sponsored transactions give a better user experience, but the chain of steps is longer. A sponsored flow typically means:
- The front end builds the transaction
- The front end asks the backend whether sponsorship is allowed
- The backend performs risk control, co-signs, and pays
- The user completes any required signature
- The transaction is submitted for execution
So when a sponsored transaction fails, you can't troubleshoot by staring at the front end alone; you need to identify which layer the problem is in:
- The front end built the transaction incorrectly
- The user didn't meet the eligibility requirements
- The backend refused to sponsor
- The wallet signature stage failed
- On-chain execution itself failed
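One hypothetical way to make that layering explicit is to tag each failure with the layer where it was detected, so logs answer "which layer failed?" rather than just "it failed". The error class and layer names here are illustrative, not part of any SDK:

```typescript
// Tag sponsored-transaction failures with the layer where they were detected.
type FailureLayer = 'build' | 'eligibility' | 'sponsorship' | 'signature' | 'execution'

class SponsoredTxError extends Error {
  constructor(public readonly layer: FailureLayer, message: string) {
    super(`[${layer}] ${message}`) // layer prefix makes logs searchable
    this.name = 'SponsoredTxError'
  }
}

// Example: a backend sponsorship refusal becomes a distinct, attributable error.
function refuseSponsorship(reason: string): never {
  throw new SponsoredTxError('sponsorship', reason)
}
```

Each stage of the sponsored flow then throws with its own layer tag, and the notification layer can choose different user messages per layer.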
5.6 Reading On-Chain Data (GraphQL)
import {
getAssemblyWithOwner,
getObjectWithJson,
executeGraphQLQuery,
} from '@evefrontier/dapp-kit'
// Get assembly and its owner information
async function loadAssembly(assemblyId: string) {
const { moveObject, character } = await getAssemblyWithOwner(assemblyId)
console.log('Assembly data:', moveObject)
console.log('Owner character:', character)
}
// Custom GraphQL query
async function queryGates() {
const query = `
query GetGates($type: String!) {
objects(filter: { type: $type }, first: 10) {
nodes {
address
asMoveObject { contents { json } }
}
}
}
`
const data = await executeGraphQLQuery(query, {
type: `${WORLD_PACKAGE}::gate::Gate`
})
return data
}
Why Can't the Front End Rely on Event Streams Alone?
Because front-end pages usually need current state, not just what happened historically.
Events are better suited to answering:
- Who did what, and when
- Whether a particular action occurred
- Logs, notifications, and timelines
Object queries are better suited to answering:
- What is this facility's state right now?
- How much inventory currently remains?
- Who is the current owner?
- Is it currently online?
Mature dApps therefore typically:
- Use object queries for current state
- Use event queries to supplement history and timelines
Reconstructing current state from events alone tends to become increasingly fragile.
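That division of labor can be sketched as a single loader that runs both queries in parallel. `fetchObject` and `fetchEvents` are hypothetical stand-ins injected by the caller, not SDK functions:

```typescript
// Current state from an object query, history from an event query.
interface GateView {
  current: { status: string; owner: string }
  recentJumps: { payer: string; timestampMs: number }[]
}

async function loadGateView(
  fetchObject: (id: string) => Promise<{ status: string; owner: string }>,
  fetchEvents: (type: string) => Promise<{ payer: string; timestampMs: number }[]>,
  gateId: string,
): Promise<GateView> {
  // Run both in parallel: the object is authoritative "now",
  // events are a supplementary timeline.
  const [current, recentJumps] = await Promise.all([
    fetchObject(gateId),
    fetchEvents('TollCollected'),
  ])
  return { current, recentJumps }
}
```

If the two sources ever disagree, the object query wins; events only decorate the page.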
5.7 Practical Utility Functions
import {
abbreviateAddress,
isOwner,
formatM3,
formatDuration,
getTxUrl,
getDatahubGameInfo,
} from '@evefrontier/dapp-kit'
// Shorten address: 0x1234...cdef
abbreviateAddress('0x1234567890abcdef')
// Check if currently connected wallet is specified object's owner
const isMine = isOwner(assembly, currentAddress)
// Format volume
formatM3(1500) // "1.5 m³"
// Format time
formatDuration(3661000) // "1h 1m 1s"
// Get transaction browser link
getTxUrl('HNFaf...') // Returns Sui Explorer URL
// Get game item metadata (name, icon, etc.)
const info = await getDatahubGameInfo(83463)
console.log(info.name, info.iconUrl)
These utility functions may look like scraps, but they directly determine whether your front end feels like a product rather than a script page. For example:
- Unabbreviated addresses make the page hard to read
- Unformatted amounts and volumes are hard for players to judge at a glance
- Without transaction links, neither users nor developers can trace problems when they occur
Front-end polish is built up from exactly these small functions.
5.8 Complete dApp Example
// src/App.tsx
import { useConnection, useSmartObject, useNotification } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
export default function App() {
const { isConnected, handleConnect, currentAddress } = useConnection()
const { assembly, loading } = useSmartObject()
const { showNotification } = useNotification()
const dAppKit = useDAppKit()
const handleJump = async () => {
if (!isConnected) {
showNotification({ type: 'warning', message: 'Please connect wallet first' })
return
}
const tx = new Transaction()
tx.moveCall({
target: `${MY_PACKAGE}::toll_gate::pay_and_jump`,
arguments: [
tx.object(GATE_ID),
tx.object(DEST_GATE_ID),
tx.object(CHARACTER_ID),
tx.splitCoins(tx.gas, [tx.pure.u64(100)]),
],
})
try {
await dAppKit.signAndExecuteTransaction({ transaction: tx })
showNotification({ type: 'success', message: 'Jump successful!' })
} catch (e: any) {
showNotification({ type: 'error', message: e.message })
}
}
if (loading) return <div>Loading...</div>
return (
<div className="app">
<header>
<h1>🌀 Stargate Console</h1>
{!isConnected
? <button onClick={handleConnect}>Connect Wallet</button>
: <span>✅ {currentAddress?.slice(0, 8)}...</span>
}
</header>
<main>
<div className="gate-info">
<h2>{assembly?.name ?? 'Unknown Gate'}</h2>
<p>Status: {assembly?.status}</p>
</div>
<button
className="jump-btn"
onClick={handleJump}
disabled={!isConnected}
>
💳 Pay 100 SUI and Jump
</button>
</main>
</div>
)
}
Although this example is simple, it demonstrates a complete minimal interaction loop:
- Connect wallet
- Read the current assembly
- Build a transaction
- Request signature and execute
- Report the result via a notification
Real Projects Usually Need Three More Layers of State
The example works, but to make it a stable product you usually need to add:
- Local UI state: button loading, modal open/close, form input
- Wallet state: current address, authorization status, whether the wrong network is selected
- On-chain object state: facility status, inventory, price, current owner
Don't mix these three layers into one. They update at different speeds, with different reliability, and require different troubleshooting methods.
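A minimal sketch of keeping the three layers apart as separate typed slices (all names here are illustrative, not SDK types):

```typescript
// Three state layers with different update speeds and reliability.
interface UiState     { isJumping: boolean; modalOpen: boolean }        // fast, local-only
interface WalletState { address: string | null; network: string }       // session-scoped
interface ChainState  { status: string; price: bigint; owner: string }  // slow, polled

// Derived checks read across layers without merging them into one blob:
function canJump(ui: UiState, wallet: WalletState, chain: ChainState): boolean {
  return !ui.isJumping && wallet.address !== null && chain.status === 'Online'
}
```

Because the layers stay separate, a stale `ChainState` poll can never overwrite a live button-loading flag, and a wallet disconnect invalidates only the wallet slice.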
5.9 Embedding the dApp in the Game
When a player approaches your assembly in-game, the client loads your registered dApp URL in an overlay. Configuration:
- Deploy the dApp to a public URL (e.g. Vercel or Netlify)
- Set the dApp URL in the assembly configuration
- When players interact, the game client automatically opens the URL and passes in `?itemId=...&tenant=...` parameters
Related documentation: Connecting In-Game | Customizing External dApps
The Biggest Differences Between the In-Game Overlay and an External Browser
Both are called dApps, but the runtime constraints differ:
- In-game overlay: an embedded page controlled by the host environment; prioritize speed, stability, clear parameters, and short interaction paths
- External browser: an independent web application that can support a fuller page structure and longer interaction flows
When building in-game dApps, pay extra attention to:
- The first screen must load fast
- Avoid relying on complex multi-page navigation
- Provide fallback prompts when parameters are missing
- Show clear state pages when the wallet isn't connected, the character isn't initialized, or the facility is offline
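The "fallback prompt when parameters are missing" point can be sketched with plain `URLSearchParams`, no SDK assumed:

```typescript
// Parse the ?itemId=...&tenant=... parameters the game client passes in,
// with an explicit fallback instead of a blank page when they're missing.
type OverlayParams =
  | { ok: true; itemId: string; tenant: string }
  | { ok: false; reason: string }

function parseOverlayParams(search: string): OverlayParams {
  const params = new URLSearchParams(search)
  const itemId = params.get('itemId')
  const tenant = params.get('tenant')
  if (!itemId || !tenant) {
    // Render this reason as a friendly state page, not a crash.
    return { ok: false, reason: 'Missing itemId/tenant: please open this dApp from in-game' }
  }
  return { ok: true, itemId, tenant }
}
```

The discriminated-union return forces the page component to handle the missing-parameter branch before it can touch `itemId` or `tenant`.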
🔖 Chapter Summary
| Knowledge Point | Core Points |
|---|---|
| Provider Configuration | `<EveFrontierProvider>` wraps the entire application |
| URL Parameters | `?tenant=&itemId=` binds the on-chain assembly |
| `useConnection` | Wallet connection state and operations |
| `useSmartObject` | Auto-polling of assembly on-chain data |
| Execute Transaction | `dAppKit.signAndExecuteTransaction()` |
| Sponsored Transaction | `signAndExecuteSponsoredTransaction()` gas-free |
| Read Data | GraphQL / `getAssemblyWithOwner()` |
📚 Extended Reading
- dapp-kit Complete Documentation
- TypeDoc API Documentation
- Connecting from an External Browser
- @mysten/dapp-kit-react Documentation
Practical Example 1: Whitelisted Mining Zone Guard (Smart Turret Access Control)
Goal: Write a smart turret extension that only allows players holding a “Mining Pass NFT” to pass; build a management interface that enables the Owner to issue passes online.
Status: Mapped to a local code directory. The content covers the Pass NFT and turret whitelist logic, making it a complete Builder closed-loop example.
Corresponding Code Directory
Minimal Call Chain
Owner issues pass -> Player holds MiningPass -> Turret extension reads credentials -> Grant passage or fire
Requirements Analysis
Scenario: Your alliance has mined a rare mineral zone in deep space and deployed a smart turret to protect the base. You want to treat different roles differently:
- ✅ Alliance members: hold a `MiningPass` NFT; the turret grants passage
- ❌ Non-members: have no `MiningPass`; the turret automatically fires
Additional requirements:
- The Owner (you) can issue a `MiningPass` to trusted roles via the dApp
- A `MiningPass` can be revoked by the Owner
- The dApp displays the current protection status and the pass holder list
Part 1: Move Contract Development
Directory Structure
mining-guard/
├── Move.toml
└── sources/
├── mining_pass.move # NFT definition
└── guard_extension.move # Turret extension
Step 1: Define MiningPass NFT
// sources/mining_pass.move
module mining_guard::mining_pass;
use sui::object::{Self, UID};
use sui::tx_context::TxContext;
use sui::transfer;
use sui::event;
/// Mining zone pass NFT
public struct MiningPass has key, store {
id: UID,
holder_name: vector<u8>, // Holder name (for identification)
issued_at_ms: u64, // Issue timestamp
zone_id: u64, // Which mining zone (supports multiple zones)
}
/// Admin capability (only held by contract deployer)
public struct AdminCap has key, store {
id: UID,
}
/// Event: New pass issued
public struct PassIssued has copy, drop {
pass_id: ID,
recipient: address,
zone_id: u64,
}
/// Contract initialization: Deployer receives AdminCap
fun init(ctx: &mut TxContext) {
let admin_cap = AdminCap {
id: object::new(ctx),
};
// Transfer AdminCap to deployer address
transfer::transfer(admin_cap, ctx.sender());
}
/// Issue mining zone pass (only callable by AdminCap holder)
public fun issue_pass(
_admin_cap: &AdminCap, // Verify caller is admin
recipient: address, // Recipient address
holder_name: vector<u8>,
zone_id: u64,
ctx: &mut TxContext,
) {
let pass = MiningPass {
id: object::new(ctx),
holder_name,
issued_at_ms: ctx.epoch_timestamp_ms(),
zone_id,
};
// Emit event
event::emit(PassIssued {
pass_id: object::id(&pass),
recipient,
zone_id,
});
// Transfer pass to recipient
transfer::transfer(pass, recipient);
}
/// Revoke pass
/// The Owner destroys a pass via admin_cap. A fuller design would add a
/// recall-then-destroy flow; here it is simplified to burning a pass the admin holds.
public fun revoke_pass(
_admin_cap: &AdminCap,
pass: MiningPass,
) {
let MiningPass { id, .. } = pass;
id.delete();
}
/// Check if pass belongs to a specific zone
public fun is_valid_for_zone(pass: &MiningPass, zone_id: u64): bool {
pass.zone_id == zone_id
}
Step 2: Write Turret Extension
// sources/guard_extension.move
module mining_guard::guard_extension;
use mining_guard::mining_pass::{Self, MiningPass};
use world::turret::{Self, Turret};
use world::character::Character;
use sui::tx_context::TxContext;
/// Turret extension Witness type
public struct GuardAuth has drop {}
/// Protected zone ID (this version protects zone 1)
const PROTECTED_ZONE_ID: u64 = 1;
/// Request safe passage (player with pass is allowed by turret)
///
/// Note: The actual turret's "no-fire" logic is executed by the game server,
/// this contract is used to verify and record the permission intent
public fun request_safe_passage(
turret: &mut Turret,
character: &Character,
pass: &MiningPass, // Must hold pass
ctx: &mut TxContext,
) {
// Verify pass belongs to correct zone
assert!(
mining_pass::is_valid_for_zone(pass, PROTECTED_ZONE_ID),
0 // Error code: Invalid zone pass
);
// Call turret's safe passage function, pass GuardAuth{} as extension credential
// (Actual API depends on world contract)
turret::grant_safe_passage(
turret,
character,
GuardAuth {},
ctx,
);
}
Step 3: Compile and Publish
cd mining-guard
# Compile check
sui move build
# Publish to testnet
sui client publish
# Record output:
# Package ID: 0x_YOUR_PACKAGE_ID_
# AdminCap Object ID: 0x_YOUR_ADMIN_CAP_
Step 4: Register Extension to Turret
// scripts/register-extension.ts
import { Transaction } from "@mysten/sui/transactions";
import { SuiClient } from "@mysten/sui/client";
import { Ed25519Keypair } from "@mysten/sui/keypairs/ed25519";
const WORLD_PACKAGE = "0x...";
const MY_PACKAGE = "0x_YOUR_PACKAGE_ID_";
const TURRET_ID = "0x...";
const CHARACTER_ID = "0x...";
const OWNER_CAP_ID = "0x...";
async function registerExtension() {
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" });
const keypair = Ed25519Keypair.fromSecretKey(/* your key */);
const tx = new Transaction();
// 1. Borrow turret's OwnerCap from character
const [ownerCap] = tx.moveCall({
target: `${WORLD_PACKAGE}::character::borrow_owner_cap`,
typeArguments: [`${WORLD_PACKAGE}::turret::Turret`],
arguments: [tx.object(CHARACTER_ID), tx.object(OWNER_CAP_ID)],
});
// 2. Register our extension
tx.moveCall({
target: `${WORLD_PACKAGE}::turret::authorize_extension`,
typeArguments: [`${MY_PACKAGE}::guard_extension::GuardAuth`],
arguments: [tx.object(TURRET_ID), ownerCap],
});
// 3. Return OwnerCap
tx.moveCall({
target: `${WORLD_PACKAGE}::character::return_owner_cap`,
typeArguments: [`${WORLD_PACKAGE}::turret::Turret`],
arguments: [tx.object(CHARACTER_ID), ownerCap],
});
const result = await client.signAndExecuteTransaction({
signer: keypair,
transaction: tx,
});
console.log("Extension registration successful! Tx:", result.digest);
}
registerExtension();
Part 2: Admin dApp
Feature: Pass Issuance Interface
// src/AdminPanel.tsx
import { useState } from 'react'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { useConnection } from '@evefrontier/dapp-kit'
import { Transaction } from '@mysten/sui/transactions'
const MY_PACKAGE = "0x_YOUR_PACKAGE_ID_"
const ADMIN_CAP_ID = "0x_YOUR_ADMIN_CAP_"
export function AdminPanel() {
const { isConnected, handleConnect } = useConnection()
const dAppKit = useDAppKit()
const [recipient, setRecipient] = useState('')
const [holderName, setHolderName] = useState('')
const [status, setStatus] = useState('')
const issuePass = async () => {
if (!recipient || !holderName) {
setStatus('❌ Please fill in recipient address and name')
return
}
const tx = new Transaction()
tx.moveCall({
target: `${MY_PACKAGE}::mining_pass::issue_pass`,
arguments: [
tx.object(ADMIN_CAP_ID),
tx.pure.address(recipient),
tx.pure.vector('u8', Array.from(new TextEncoder().encode(holderName))),
tx.pure.u64(1), // Zone ID
],
})
try {
setStatus('⏳ Submitting transaction...')
const result = await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Pass issued! Tx: ${result.digest.slice(0, 12)}...`)
} catch (e: any) {
setStatus(`❌ Failed: ${e.message}`)
}
}
if (!isConnected) {
return (
<div className="admin-panel">
<button onClick={handleConnect}>🔗 Connect Admin Wallet</button>
</div>
)
}
return (
<div className="admin-panel">
<h2>🛡 Mining Pass Management</h2>
<div className="form-group">
<label>Recipient Sui Address</label>
<input
value={recipient}
onChange={e => setRecipient(e.target.value)}
placeholder="0x..."
/>
</div>
<div className="form-group">
<label>Holder Name</label>
<input
value={holderName}
onChange={e => setHolderName(e.target.value)}
placeholder="Mining Corp Alpha"
/>
</div>
<button className="issue-btn" onClick={issuePass}>
📜 Issue Mining Pass
</button>
{status && <p className="status">{status}</p>}
</div>
)
}
Part 3: Player dApp
// src/PlayerPanel.tsx
import { useState } from 'react'
import { useConnection, useSmartObject } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
const MY_PACKAGE = "0x_YOUR_PACKAGE_ID_"
const TURRET_ID = "0x..."
const CHARACTER_ID = "0x..."
export function PlayerPanel() {
const { isConnected, handleConnect } = useConnection()
const { assembly, loading } = useSmartObject()
const dAppKit = useDAppKit()
const [passId, setPassId] = useState('')
const [status, setStatus] = useState('')
const requestPassage = async () => {
const tx = new Transaction()
tx.moveCall({
target: `${MY_PACKAGE}::guard_extension::request_safe_passage`,
arguments: [
tx.object(TURRET_ID),
tx.object(CHARACTER_ID),
tx.object(passId), // Player's MiningPass Object ID
],
})
try {
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus('✅ Safe passage recorded, turret will grant access')
} catch (e: any) {
setStatus('❌ Pass verification failed, cannot enter mining zone')
}
}
if (!isConnected) return <button onClick={handleConnect}>Connect Wallet</button>
if (loading) return <div>Loading turret status...</div>
return (
<div className="player-panel">
<h2>⚡ {assembly?.name ?? 'Mining Zone Guard Turret'}</h2>
<p>Status: {assembly?.status}</p>
<div className="pass-input">
<label>Enter your Mining Pass Object ID</label>
<input
value={passId}
onChange={e => setPassId(e.target.value)}
placeholder="0x..."
/>
<button onClick={requestPassage}>🛡 Request Safe Passage</button>
</div>
{status && <p>{status}</p>}
</div>
)
}
🎯 Complete Implementation Review
1. Move Contracts
├── mining_pass.move → Define MiningPass NFT + AdminCap + issue_pass / revoke_pass
└── guard_extension.move → Turret extension + request_safe_passage (verify pass then call turret API)
2. Registration Flow
└── authorize_extension<GuardAuth>(turret, owner_cap)
3. Admin dApp
└── Enter address and name → Call issue_pass → Transfer NFT to target role
4. Player dApp
└── Enter pass ID → Call request_safe_passage → Turret passage record on-chain
🔧 Extension Exercises
- Add an expiration time to `MiningPass`; the turret denies passage after expiry
- Record all active passes in the contract so the dApp can query and display them
- Implement a "team license": one pass usable by multiple predefined members
📚 Related Documentation
- Smart Turret Documentation
- Chapter 3: Move Security Patterns
- Chapter 4: Register Extensions to Components
Practical Example 2: Space Highway Toll Station (Smart Stargate Toll System)
Goal: Write a smart stargate extension that charges LUX tokens per jump; build a player-facing ticket purchase dApp interface.
Status: Mapped to a local code directory. The content covers the toll stargate, ticket, and treasury trio, making it one of the most typical Builder commercialization examples.
Corresponding Code Directory
Minimal Call Chain
Player pays toll -> Treasury receives payment -> Mint JumpTicket -> Stargate verifies ticket -> Complete jump
Requirements Analysis
Scenario: You and your alliance control a strategic corridor consisting of two stargates, connecting two busy regions of the universe. You decide to commercialize this route:
- 🎟 Any player wanting to jump must pay 50 LUX to purchase a `JumpTicket`
- 🏦 All collected LUX goes into the treasury (a contract-managed shared object)
- 💰 Only the Owner (you) can withdraw LUX from the treasury
- 📊 The dApp displays the current ticket price, jump count, and treasury balance in real time
Part 1: Move Contract Development
Directory Structure
toll-gate/
├── Move.toml
└── sources/
├── treasury.move # Treasury: Collect and manage LUX
└── toll_gate.move # Stargate extension: Toll logic
Step 1: Define Treasury Contract
// sources/treasury.move
module toll_gate::treasury;
use sui::object::{Self, UID};
use sui::balance::{Self, Balance};
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::tx_context::TxContext;
use sui::transfer;
use sui::event;
// ── Type Definitions ─────────────────────────────────────────────
// Here we use the SUI coin type to stand in for LUX (demo only).
// In an actual deployment, replace it with the LUX Coin type.
/// Treasury: Collect all tolls
public struct TollTreasury has key {
id: UID,
balance: Balance<SUI>,
total_jumps: u64, // Total jump count (for statistics)
toll_amount: u64, // Current ticket price (in MIST, 1 SUI = 10^9 MIST)
}
/// OwnerCap: Only holders can withdraw treasury funds
public struct TreasuryOwnerCap has key, store {
id: UID,
}
// ── Events ──────────────────────────────────────────────────
public struct TollCollected has copy, drop {
payer: address,
amount: u64,
total_jumps: u64,
}
public struct TollWithdrawn has copy, drop {
recipient: address,
amount: u64,
}
// ── Initialization ────────────────────────────────────────────────
fun init(ctx: &mut TxContext) {
// Create treasury (shared object, anyone can deposit)
let treasury = TollTreasury {
id: object::new(ctx),
balance: balance::zero(),
total_jumps: 0,
toll_amount: 50_000_000_000, // 50 SUI (unit: MIST)
};
// Create Owner credential (transfer to deployer)
let owner_cap = TreasuryOwnerCap {
id: object::new(ctx),
};
transfer::share_object(treasury);
transfer::transfer(owner_cap, ctx.sender());
}
// ── Public Functions ──────────────────────────────────────────────
/// Deposit toll (called by stargate extension)
public fun deposit_toll(
treasury: &mut TollTreasury,
payment: Coin<SUI>,
payer: address,
) {
let amount = coin::value(&payment);
// Verify correct amount
assert!(amount >= treasury.toll_amount, 1); // E_INSUFFICIENT_FEE
treasury.total_jumps = treasury.total_jumps + 1;
balance::join(&mut treasury.balance, coin::into_balance(payment));
event::emit(TollCollected {
payer,
amount,
total_jumps: treasury.total_jumps,
});
}
/// Withdraw treasury LUX (only callable by TreasuryOwnerCap holder)
public fun withdraw(
treasury: &mut TollTreasury,
_cap: &TreasuryOwnerCap,
amount: u64,
ctx: &mut TxContext,
) {
let coin = coin::take(&mut treasury.balance, amount, ctx);
transfer::public_transfer(coin, ctx.sender());
event::emit(TollWithdrawn {
recipient: ctx.sender(),
amount,
});
}
/// Change ticket price (Owner calls)
public fun set_toll_amount(
treasury: &mut TollTreasury,
_cap: &TreasuryOwnerCap,
new_amount: u64,
) {
treasury.toll_amount = new_amount;
}
/// Read current ticket price
public fun toll_amount(treasury: &TollTreasury): u64 {
treasury.toll_amount
}
/// Read treasury balance
public fun balance_amount(treasury: &TollTreasury): u64 {
balance::value(&treasury.balance)
}
Step 2: Write Stargate Extension
// sources/toll_gate.move
module toll_gate::toll_gate_ext;
use toll_gate::treasury::{Self, TollTreasury};
use world::gate::{Self, Gate};
use world::character::Character;
use sui::coin::Coin;
use sui::sui::SUI;
use sui::clock::Clock;
use sui::tx_context::TxContext;
/// Stargate extension Witness type
public struct TollAuth has drop {}
/// Default jump permit validity: 15 minutes
const PERMIT_DURATION_MS: u64 = 15 * 60 * 1000;
/// Pay toll and get jump permit
public fun pay_toll_and_get_permit(
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
treasury: &mut TollTreasury,
payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
// 1. Collect toll
treasury::deposit_toll(treasury, payment, ctx.sender());
// 2. Calculate Permit expiration time
let expires_at = clock.timestamp_ms() + PERMIT_DURATION_MS;
// 3. Request jump permit from stargate (TollAuth{} is extension credential)
gate::issue_jump_permit(
source_gate,
destination_gate,
character,
TollAuth {},
expires_at,
ctx,
);
// Note: JumpPermit object is automatically transferred to character's Owner
}
Step 3: Publish Contract
cd toll-gate
sui move build
sui client publish
# Record:
# Package ID: 0x_TOLL_PACKAGE_
# TollTreasury ID: 0x_TREASURY_ID_ (shared object)
# TreasuryOwnerCap ID: 0x_OWNER_CAP_ID_
Step 4: Register Extension to Stargate
// scripts/authorize-toll-gate.ts
import { Transaction } from "@mysten/sui/transactions";
import { SuiClient } from "@mysten/sui/client";
import { Ed25519Keypair } from "@mysten/sui/keypairs/ed25519";
const WORLD_PACKAGE = "0x...";
const TOLL_PACKAGE = "0x_TOLL_PACKAGE_";
const GATE_ID = "0x...";
const CHARACTER_ID = "0x...";
const GATE_OWNER_CAP_ID = "0x...";
async function authorizeTollGate() {
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" });
const keypair = Ed25519Keypair.fromSecretKey(/* your key */);
const tx = new Transaction();
// Borrow stargate OwnerCap
const [ownerCap] = tx.moveCall({
target: `${WORLD_PACKAGE}::character::borrow_owner_cap`,
typeArguments: [`${WORLD_PACKAGE}::gate::Gate`],
arguments: [tx.object(CHARACTER_ID), tx.object(GATE_OWNER_CAP_ID)],
});
// Register TollAuth as authorized extension
tx.moveCall({
target: `${WORLD_PACKAGE}::gate::authorize_extension`,
typeArguments: [`${TOLL_PACKAGE}::toll_gate_ext::TollAuth`],
arguments: [tx.object(GATE_ID), ownerCap],
});
// Return OwnerCap
tx.moveCall({
target: `${WORLD_PACKAGE}::character::return_owner_cap`,
typeArguments: [`${WORLD_PACKAGE}::gate::Gate`],
arguments: [tx.object(CHARACTER_ID), ownerCap],
});
const result = await client.signAndExecuteTransaction({
signer: keypair,
transaction: tx,
});
console.log("Toll station extension registered successfully!", result.digest);
}
authorizeTollGate();
Part 2: Player Ticket Purchase dApp
Complete Ticket Purchase Interface
// src/TollGateApp.tsx
import { useState, useEffect } from 'react'
import { useConnection, useSmartObject, getObjectWithJson } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
const WORLD_PACKAGE = "0x..."
const TOLL_PACKAGE = "0x_TOLL_PACKAGE_"
const SOURCE_GATE_ID = "0x..."
const DEST_GATE_ID = "0x..."
const CHARACTER_ID = "0x..."
const TREASURY_ID = "0x_TREASURY_ID_"
interface TreasuryData {
toll_amount: string
total_jumps: string
balance: string
}
export function TollGateApp() {
const { isConnected, handleConnect, currentAddress } = useConnection()
const { assembly, loading } = useSmartObject()
const dAppKit = useDAppKit()
const [treasury, setTreasury] = useState<TreasuryData | null>(null)
const [txStatus, setTxStatus] = useState('')
const [isPaying, setIsPaying] = useState(false)
// Load treasury data
const loadTreasury = async () => {
const data = await getObjectWithJson(TREASURY_ID)
if (data?.content?.dataType === 'moveObject') {
setTreasury(data.content.fields as TreasuryData)
}
}
useEffect(() => {
loadTreasury()
const interval = setInterval(loadTreasury, 10_000) // Refresh every 10 seconds
return () => clearInterval(interval)
}, [])
const payAndJump = async () => {
if (!isConnected) {
setTxStatus('❌ Please connect wallet first')
return
}
setIsPaying(true)
setTxStatus('⏳ Submitting transaction...')
const tollAmount = BigInt(treasury?.toll_amount ?? 50_000_000_000)
const tx = new Transaction()
// Split out ticket price amount of SUI
const [paymentCoin] = tx.splitCoins(tx.gas, [
tx.pure.u64(tollAmount)
])
// Call toll and get Permit
tx.moveCall({
target: `${TOLL_PACKAGE}::toll_gate_ext::pay_toll_and_get_permit`,
arguments: [
tx.object(SOURCE_GATE_ID),
tx.object(DEST_GATE_ID),
tx.object(CHARACTER_ID),
tx.object(TREASURY_ID),
paymentCoin,
tx.object('0x6'), // Clock system object
],
})
try {
const result = await dAppKit.signAndExecuteTransaction({
transaction: tx,
})
setTxStatus(`✅ Jump permit obtained! Tx: ${result.digest.slice(0, 12)}...`)
loadTreasury() // Refresh treasury data
} catch (e: any) {
setTxStatus(`❌ ${e.message}`)
} finally {
setIsPaying(false)
}
}
const tollInSui = treasury
? (Number(treasury.toll_amount) / 1e9).toFixed(2)
: '...'
const balanceInSui = treasury
? (Number(treasury.balance) / 1e9).toFixed(2)
: '...'
return (
<div className="toll-gate-app">
{/* Stargate Info */}
<header className="gate-header">
<div className="gate-icon">🌀</div>
<div>
<h1>{loading ? '...' : assembly?.name ?? 'Stargate'}</h1>
<span className={`status-badge ${assembly?.status?.toLowerCase()}`}>
{assembly?.status ?? 'Detecting...'}
</span>
</div>
</header>
{/* Toll Info */}
<section className="toll-info">
<div className="info-card">
<span className="label">💰 Current Price</span>
<span className="value">{tollInSui} SUI</span>
</div>
<div className="info-card">
<span className="label">🚀 Total Jumps</span>
<span className="value">{treasury?.total_jumps ?? '...'} times</span>
</div>
<div className="info-card">
<span className="label">🏦 Treasury Balance</span>
<span className="value">{balanceInSui} SUI</span>
</div>
</section>
{/* Jump Action */}
<section className="jump-section">
{!isConnected ? (
<button className="connect-btn" onClick={handleConnect}>
🔗 Connect EVE Vault Wallet
</button>
) : (
<>
<div className="wallet-info">
✅ {currentAddress?.slice(0, 6)}...{currentAddress?.slice(-4)}
</div>
<button
className="jump-btn"
onClick={payAndJump}
disabled={isPaying || assembly?.status !== 'Online'}
>
{isPaying ? '⏳ Processing...' : `🛸 Pay ${tollInSui} SUI and Jump`}
</button>
</>
)}
{txStatus && (
<div className={`tx-status ${txStatus.startsWith('✅') ? 'success' : 'error'}`}>
{txStatus}
</div>
)}
</section>
{/* Destination Info */}
<section className="destination-info">
<p>📍 Destination: <strong>Alpha Centauri Mining Zone</strong></p>
<p>⏱ Permit Validity: <strong>15 minutes</strong></p>
</section>
</div>
)
}
Part 3: Owner Management Panel
// src/OwnerPanel.tsx
import { useState } from 'react'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
const TOLL_PACKAGE = "0x_TOLL_PACKAGE_"
const TREASURY_ID = "0x_TREASURY_ID_"
const OWNER_CAP_ID = "0x_OWNER_CAP_ID_"
export function OwnerPanel({ treasuryBalance }: { treasuryBalance: number }) {
const dAppKit = useDAppKit()
const [withdrawAmount, setWithdrawAmount] = useState('')
const [newToll, setNewToll] = useState('')
const [status, setStatus] = useState('')
const withdraw = async () => {
const amountMist = Math.floor(parseFloat(withdrawAmount) * 1e9)
const tx = new Transaction()
tx.moveCall({
target: `${TOLL_PACKAGE}::treasury::withdraw`,
arguments: [
tx.object(TREASURY_ID),
tx.object(OWNER_CAP_ID),
tx.pure.u64(amountMist),
],
})
try {
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Withdrawn ${withdrawAmount} SUI`)
} catch (e: any) {
setStatus(`❌ ${e.message}`)
}
}
const updateToll = async () => {
const amountMist = Math.floor(parseFloat(newToll) * 1e9)
const tx = new Transaction()
tx.moveCall({
target: `${TOLL_PACKAGE}::treasury::set_toll_amount`,
arguments: [
tx.object(TREASURY_ID),
tx.object(OWNER_CAP_ID),
tx.pure.u64(amountMist),
],
})
try {
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Ticket price updated to ${newToll} SUI`)
} catch (e: any) {
setStatus(`❌ ${e.message}`)
}
}
return (
<div className="owner-panel">
<h2>⚙️ Toll Station Management</h2>
<div className="panel-section">
<h3>💵 Withdraw Revenue</h3>
<p>Treasury Balance: {(treasuryBalance / 1e9).toFixed(2)} SUI</p>
<input
type="number"
value={withdrawAmount}
onChange={e => setWithdrawAmount(e.target.value)}
placeholder="Withdraw amount (SUI)"
/>
<button onClick={withdraw}>Withdraw to Wallet</button>
</div>
<div className="panel-section">
<h3>🏷 Adjust Price</h3>
<input
type="number"
value={newToll}
onChange={e => setNewToll(e.target.value)}
placeholder="New price (SUI)"
/>
<button onClick={updateToll}>Update Price</button>
</div>
{status && <p className="status">{status}</p>}
</div>
)
}
🎯 Complete Implementation Review
Move Contract Layer
├── treasury.move
│ ├── TollTreasury (shared treasury object)
│ ├── TreasuryOwnerCap (withdrawal credential)
│ ├── deposit_toll() ← Extension calls
│ ├── withdraw() ← Owner calls
│ └── set_toll_amount() ← Owner calls
│
└── toll_gate_ext.move
├── TollAuth (Witness type)
└── pay_toll_and_get_permit() ← Player calls
├── 1. Verify and charge → treasury.deposit_toll()
└── 2. Issue permit → gate::issue_jump_permit()
dApp Layer
├── TollGateApp.tsx → Player ticket purchase interface
│ ├── Real-time display of price, jump count, treasury balance
│ └── One-click payment and get JumpPermit
└── OwnerPanel.tsx → Admin panel
├── Withdraw treasury revenue
└── Adjust ticket price
🔧 Extension Exercises
- Tiered Membership: Alliance members holding membership NFT get discounts (check NFT then apply different prices)
- Limited-Time Free Passage: Automatically accept 0 LUX Permits during specific time periods (e.g., maintenance)
- Revenue Distribution: Treasury revenue automatically distributed to multiple alliance stakeholder addresses by proportion
- History dApp: Listen to `TollCollected` events, display the 50 most recent jump records
📚 Related Documentation
- Smart Gate Documentation
- Interfacing with the World
- Chapter 3: Move Resources and Coin Model
- Chapter 5: dApp Initiating On-Chain Transactions
- builder-scaffold Smart Gate Example
Chapter 6: Complete Guide to Builder Scaffold (Part 1) — Project Structure and Contract Development
Learning Objectives: Master the complete directory structure of `builder-scaffold`, understand both Docker and native development workflows, and be able to independently complete local development and deployment of the smart_gate contract.
Status: Mapped to local scaffold directory. Commands in this text are based on the existing `builder-scaffold` directory in this repository.
Minimal Call Chain
Start local chain -> Compile smart_gate -> Deploy -> Record package/object id -> Configure rules -> Issue permit
Corresponding Code Directory
1. What is Builder Scaffold?
builder-scaffold is the official one-stop Builder development scaffold provided by EVE Frontier, including:
- Move Contract Templates: Two complete Smart Gate Extension examples
- TypeScript Interaction Scripts: Ready-to-use on-chain interaction scripts after deployment
- Docker Development Environment: Zero-configuration, out-of-the-box local chain
- dApp Template: React + EVE Frontier dapp-kit frontend starting point
builder-scaffold/
├── docker/ # Docker dev environment (Sui CLI + Node.js container)
├── move-contracts/ # Move contract examples
│ ├── smart_gate/ # Main example: Star Gate Extension
│ ├── storage_unit/ # Storage Unit Extension example
│ └── tokens/ # Token contract example
├── ts-scripts/ # TypeScript interaction scripts
│ ├── smart_gate/ # 6 operation scripts for smart_gate
│ ├── utils/ # Common utilities: env config, derive-object-id, proof
│ └── helpers/ # Helper functions for querying OwnerCap, etc.
├── dapps/ # React dApp template (EVE Frontier dapp-kit)
└── docs/ # Complete deployment flow documentation
The most important thing about this chapter isn’t memorizing the directory structure, but understanding:
`builder-scaffold` isn’t just an example repository; it actually pre-wires “local chain, contracts, scripts, and frontend” together for you.
So the real value is:
- Reducing the cost of getting the full loop working for the first time
- Giving you a standard skeleton that can be modified and run iteratively
- Making future custom development start from “modifying templates” rather than “building the platform yourself”
2. Choosing a Development Workflow
The official documentation supports two workflows:
| Workflow | Applicable Scenario | Prerequisites |
|---|---|---|
| Docker Workflow | Users who don’t want to install Sui/Node locally | Docker only |
| Host Workflow | Already have Sui CLI + Node.js | Sui CLI + Node.js |
The Real Trade-offs of These Two Workflows
- Docker More stable, fewer environment differences, suitable for getting it working first
- Host Faster, closer to daily development, but more dependent on your local environment being clean
If your goal is to “understand the complete loop first”, prioritize Docker. If your goal is “high-frequency iteration writing your own code”, you’ll typically gradually transition to Host.
3. Docker Development Environment (Recommended for Beginners)
Quick Start
# Clone the repository
git clone https://github.com/evefrontier/builder-scaffold.git
cd builder-scaffold
# Start the development container (first time will download images, ~2-3 minutes)
cd docker
docker compose run --rm --service-ports sui-dev
On first startup, the container will automatically:
- Create 3 ed25519 key pairs (`ADMIN`, `PLAYER_A`, `PLAYER_B`)
- Start the local Sui node
- Fund accounts with test SUI
Keys are persistently saved in Docker Volume and won’t be lost when the container restarts.
Working Directory Structure Inside Container
/workspace/
├── builder-scaffold/ # Complete repository (synced with host)
└── world-contracts/ # Visible in container after cloning on host
Edit files on the host, run commands in the container — both are synced in real-time.
Why Use -e testnet When Building?
sui move build -e testnet # ← The testnet here is "build environment", not deployment target
The local chain’s chain ID changes every restart and can’t be fixed in Move.toml. -e testnet lets dependency resolution use testnet rules, but actual deployment still goes to the local chain.
The most easily misunderstood part here is conflating “build environment” with “deployment target” as the same thing.
Using -e testnet here doesn’t mean you’re actually deploying to testnet now, but rather tells the builder:
- By which set of rules should dependencies be resolved
- How should package builds be processed according to which environment conventions
If this concept isn’t separated, later when switching between localnet / testnet / mainnet, you’ll be very prone to making incorrect judgments.
Container Common Commands Reference
| Task | Command |
|---|---|
| View all keys | cat /workspace/builder-scaffold/docker/.env.sui |
| Switch to testnet | sui client switch --env testnet |
| Import existing key | sui keytool import <key> ed25519 |
| Compile contract | cd .../smart_gate && sui move build -e testnet |
| Run TS script | cd /workspace/builder-scaffold && pnpm configure-rules |
| Verify GraphQL | curl http://localhost:9125/graphql |
| Clear and reset | docker compose down --volumes && docker compose run --rm --service-ports sui-dev |
PostgreSQL + GraphQL Indexer
The Docker environment has built-in Sui indexer and GraphQL support:
# Query chain ID (verify GraphQL startup)
curl -X POST http://localhost:9125/graphql \
-H "Content-Type: application/json" \
-d '{"query": "{ chainIdentifier }"}'
GraphQL endpoint: http://localhost:9125/graphql (can be debugged with Altair)
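The same chain-identifier check can be issued from a TypeScript script instead of curl. A minimal sketch of building the POST payload (the endpoint and query come from the curl example above; the fetch call is shown commented because it assumes the local indexer is running):

```typescript
// Build the JSON body for a GraphQL POST, matching the curl example above.
function buildGraphQLBody(query: string, variables?: Record<string, unknown>): string {
  return JSON.stringify(variables ? { query, variables } : { query });
}

// Usage sketch (assumes the Docker environment's indexer is up):
// await fetch("http://localhost:9125/graphql", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: buildGraphQLBody("{ chainIdentifier }"),
// });
```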
4. Smart Gate Contract File Structure
move-contracts/smart_gate/
├── Move.toml # Package config (depends on world-contracts)
├── sources/
│ ├── config.move # Shared config foundation: ExtensionConfig + AdminCap + XAuth
│ ├── tribe_permit.move # Example 1: Tribal identity verification pass
│ └── corpse_gate_bounty.move # Example 2: Submit corpse items for pass
└── tests/
└── gate_tests.move # Tests
Move.toml Analysis
[package]
name = "smart_gate"
edition = "2024"
[dependencies]
# Git dependency (recommended to lock stable tag)
world = { git = "https://github.com/evefrontier/world-contracts.git", subdir = "contracts/world", rev = "v0.0.14" }
[addresses]
smart_gate = "0x0" # Automatically replaced with actual address on deployment
Important: It’s recommended to use git dependencies and lock the `rev` (e.g., `v0.0.14`); don’t track `main`, otherwise breaking changes on the world-contracts main branch will directly affect compilation results.
Why the Scaffold Example Is Best for Learning “Extension Pattern”
Because it’s not an abstract demo, but rather puts several key Builder elements in:
- Dynamic field configuration
- AdminCap management
- Typed Witness extension
- Gate component integration
In other words, smart_gate isn’t teaching you to write a specific business case, but teaching you the core extension skeleton of EVE Builder.
5. config.move: Extension Base Framework
module smart_gate::config;
use sui::dynamic_field as df;
/// Automatically created after deployment, shared storage for all rules
public struct ExtensionConfig has key {
id: UID,
}
/// Admin permission credential (transferred to deployer on init)
public struct AdminCap has key, store {
id: UID,
}
/// Authorization witness type (Typed Witness), passed to gate::issue_jump_permit<XAuth>
public struct XAuth has drop {}
fun init(ctx: &mut TxContext) {
// Transfer AdminCap to deployer
transfer::transfer(AdminCap { id: object::new(ctx) }, ctx.sender());
// Share ExtensionConfig (everyone can read, only AdminCap holders can write)
transfer::share_object(ExtensionConfig { id: object::new(ctx) });
}
Dynamic Field Rules System
ExtensionConfig uses dynamic fields to store various rules, allowing a single config object to support multiple different extension rules simultaneously:
// set_rule: Insert or overwrite rule (value needs drop ability)
public fun set_rule<K: copy + drop + store, V: store + drop>(
config: &mut ExtensionConfig,
_: &AdminCap, // Only AdminCap can set
key: K,
value: V,
) {
if (df::exists_(&config.id, copy key)) {
let _old: V = df::remove(&mut config.id, copy key);
};
df::add(&mut config.id, key, value);
}
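The set-or-overwrite semantics of `set_rule` can be mirrored off-chain when reasoning about what a config object will contain after a script runs. A purely illustrative TypeScript simulation (on-chain, keys are typed structs and values live in dynamic fields; here a Map stands in for both):

```typescript
// Simulate ExtensionConfig's dynamic-field rule store: setting an existing
// key removes the old value first, then inserts the new one — mirroring the
// df::exists_ / df::remove / df::add sequence in the Move code above.
class RuleStore {
  private rules = new Map<string, unknown>();

  setRule(key: string, value: unknown): void {
    if (this.rules.has(key)) this.rules.delete(key);
    this.rules.set(key, value);
  }

  getRule<T>(key: string): T | undefined {
    return this.rules.get(key) as T | undefined;
  }
}
```

The important property this preserves: one config object can hold many independent rules, and re-setting a rule replaces it atomically rather than accumulating duplicates.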
6. tribe_permit.move: Tribal Pass (Detailed Reading)
This is the simplest Extension implementation, suitable for understanding the core structure of the extension pattern:
module smart_gate::tribe_permit;
// Rule configuration (dynamic field value)
public struct TribeConfig has drop, store {
tribe: u32, // Allowed tribe ID
expiry_duration_ms: u64, // Pass validity period (milliseconds)
}
// Rule identifier (dynamic field Key)
public struct TribeConfigKey has copy, drop, store {}
Issuing a Pass
public fun issue_jump_permit(
extension_config: &ExtensionConfig,
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
clock: &Clock,
ctx: &mut TxContext,
) {
// 1. Read rule configuration
let tribe_cfg = extension_config.borrow_rule<TribeConfigKey, TribeConfig>(TribeConfigKey {});
// 2. Verify character tribe
assert!(character.tribe() == tribe_cfg.tribe, ENotStarterTribe);
// 3. Calculate expiry time (overflow check)
let ts = clock.timestamp_ms();
assert!(ts <= (0xFFFFFFFFFFFFFFFFu64 - tribe_cfg.expiry_duration_ms), EExpiryOverflow);
let expires_at = ts + tribe_cfg.expiry_duration_ms;
// 4. Call world contract to issue JumpPermit NFT
gate::issue_jump_permit<XAuth>(
source_gate, destination_gate, character,
config::x_auth(), // Package-unique XAuth instance
expires_at, ctx,
);
}
Design Detail: Compared to the original in world-contracts, this adds overflow checking (
EExpiryOverflow), making it a more robust production implementation.
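The overflow guard translates directly to other languages: before adding a duration to the current timestamp, check that the sum fits in u64. A TypeScript sketch of the same check using BigInt (`checkedExpiry` is a hypothetical helper for illustration):

```typescript
const U64_MAX = 0xFFFFFFFFFFFFFFFFn;

// Mirror the Move check: abort (throw) if ts + duration would exceed u64::MAX,
// otherwise return the expiry timestamp.
function checkedExpiry(timestampMs: bigint, durationMs: bigint): bigint {
  if (timestampMs > U64_MAX - durationMs) {
    throw new Error("EExpiryOverflow");
  }
  return timestampMs + durationMs;
}
```

Note the check subtracts on the right-hand side rather than adding first, so the comparison itself can never overflow — the same reason the Move version is written as `ts <= MAX - duration`.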
Admin Rule Setting
public fun set_tribe_config(
extension_config: &mut ExtensionConfig,
admin_cap: &AdminCap,
tribe: u32,
expiry_duration_ms: u64,
) {
extension_config.set_rule<TribeConfigKey, TribeConfig>(
admin_cap,
TribeConfigKey {},
TribeConfig { tribe, expiry_duration_ms },
);
}
7. Compilation and Testing
# Enter smart_gate directory
cd move-contracts/smart_gate
# Compile (use testnet as build environment)
sui move build -e testnet
# Run tests
sui move test -e testnet
Common Compilation Failure Issues
| Error Message | Cause | Solution |
|---|---|---|
| `Unpublished dependencies: World` | world-contracts not deployed | Deploy world-contracts first, or switch to a local dependency |
| `Move.lock` wrong env | Environment recorded in `Move.lock` doesn’t match | `rm Move.lock && sui move build -e testnet` |
| `edition = "legacy"` warning | Using an old Move edition | Change to `edition = "2024"` in Move.toml |
8. Publishing Contract to Local Chain
# Ensure world-contracts is deployed, obtaining its publication file
sui client test-publish \
--build-env testnet \
--pubfile-path ../../deployments/Pub.localnet.toml
# After successful publication, record the output Package ID
# Fill in BUILDER_PACKAGE_ID in .env file
`test-publish` vs `publish`: `test-publish` is Sui’s special publish mode that allows publishing packages with unpublished dependencies on the local chain (for testing). For actual deployment to testnet/mainnet, use `sui client publish`.
9. Adding Your Own Extension Rules
Using the example of adding a “toll gate rule”:
Step 1: Create a new file toll_gate.move alongside config.move
module smart_gate::toll_gate;
use smart_gate::config::{Self, AdminCap, XAuth, ExtensionConfig};
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
// Rule data
public struct TollConfig has drop, store {
toll_amount: u64,
expiry_duration_ms: u64,
}
public struct TollConfigKey has copy, drop, store {}
// Fee ledger (shared object)
public struct TollVault has key {
id: UID,
balance: Balance<SUI>,
}
// Create vault on initialization
public fun create_vault(ctx: &mut TxContext) {
transfer::share_object(TollVault {
id: object::new(ctx),
balance: balance::zero(),
});
}
Step 2: Implement Issuance Function
public fun pay_and_jump(
extension_config: &ExtensionConfig,
vault: &mut TollVault,
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let toll_cfg = extension_config.borrow_rule<TollConfigKey, TollConfig>(TollConfigKey {});
assert!(coin::value(&payment) >= toll_cfg.toll_amount, ETollInsufficient);
let toll = coin::split(&mut payment, toll_cfg.toll_amount, ctx);
balance::join(&mut vault.balance, coin::into_balance(toll));
if (coin::value(&payment) > 0) {
transfer::public_transfer(payment, ctx.sender());
} else {
coin::destroy_zero(payment);
};
let expires = clock.timestamp_ms() + toll_cfg.expiry_duration_ms;
gate::issue_jump_permit<XAuth>(
source_gate, destination_gate, character, config::x_auth(), expires, ctx,
);
}
const ETollInsufficient: u64 = 0;
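The payment-handling logic in `pay_and_jump` — take exactly the toll, refund the remainder, reject underpayment — can be sanity-checked off-chain before writing tests against the chain. A purely illustrative simulation of the coin arithmetic (`settleToll` is a hypothetical helper; on-chain the Move contract remains authoritative):

```typescript
// Simulate the coin handling in pay_and_jump: split the toll into the vault,
// refund the remainder to the sender; throw (like ETollInsufficient) on underpay.
function settleToll(
  paymentMist: bigint,
  tollMist: bigint,
): { vault: bigint; refund: bigint } {
  if (paymentMist < tollMist) throw new Error("ETollInsufficient");
  return { vault: tollMist, refund: paymentMist - tollMist };
}
```

This mirrors why the Move code branches on `coin::value(&payment) > 0` at the end: a zero-value remainder coin must be destroyed with `destroy_zero` rather than transferred.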
Chapter Summary
| Component | Purpose |
|---|---|
| docker/compose.yml | One-click startup for local Sui chain + GraphQL indexer |
| move-contracts/smart_gate/ | Gate Extension main template |
| config.move | ExtensionConfig + AdminCap + XAuth base framework |
| tribe_permit.move | Example ①: Tribal identity verification |
| corpse_gate_bounty.move | Example ②: Item consumption for pass |
| -e testnet build flag | Solves the local chain’s unstable chain-ID problem |
Next Chapter: TypeScript Scripts and dApp Development — After contract deployment, how to interact with on-chain contracts using 6 ready-made scripts, and how to build an EVE Frontier frontend based on the dApp template.
Chapter 7: Complete Guide to Builder Scaffold (Part 2) — TS Scripts and dApp Development
Learning Objectives: Master the usage and principles of the 6 interaction scripts in `ts-scripts/`, understand the `helper.ts` toolchain, and learn to build your own EVE Frontier dApp based on the `dapps/` React template.
Status: Mapped scripts and dApp directories. This text is based on the script layout of the `builder-scaffold` within this repository.
Minimal Call Chain
Read .env -> helper.ts initializes client/object ID -> TS script initiates PTB -> On-chain object changes -> dApp queries and displays new state
Directory Responsibility Boundaries
To use builder-scaffold smoothly, the key isn’t memorizing every script name, but first distinguishing the three layers of responsibilities:
| Directory/File | Responsibility | Should NOT Do |
|---|---|---|
ts-scripts/smart_gate/* | Organize individual business actions, assemble PTB | Cram lots of shared utility functions |
ts-scripts/utils/helper.ts | Initialize client, read environment, encapsulate common queries | Write specific business rules |
dapps/src/* | Display state, initiate interactions, handle wallet connections | Directly hardcode environment and object IDs |
What Should a Complete Script Chain Look Like?
.env
-> helper.ts reads network / package id / key
-> Business script assembles PTB
-> Submit on-chain transaction
-> dApp or query script refreshes object state
If a script is simultaneously responsible for “reading config + querying objects + assembling complex business rules + printing UI text”, it should basically be split up.
What the script system really needs to solve isn’t just “automating command lines”, but separating responsibilities clearly:
- Where does configuration come from
- Who is responsible for common queries
- Who organizes individual business actions
- How do frontend and scripts share the same object understanding
Two Common Anti-patterns
- `helper.ts` keeps growing until it becomes an unmaintainable “god file”
- Frontend directly copies object IDs and network configs from scripts, causing scripts and pages to drift over time
Corresponding Code Directories
1. Prerequisites for TypeScript Scripts
Before running any script, you need to complete the following preparation:
Prerequisites:
1. ✅ world-contracts deployed (local or testnet)
2. ✅ smart_gate contract deployed (execute sui client publish)
3. ✅ .env file filled with all necessary environment variables
4. ✅ test-resources.json + extracted-object-ids.json exist in project root
Configure .env File
cp .env.example .env
Key environment variables:
# Network selection
NETWORK=localnet # localnet | testnet | mainnet
# Admin private key (exported Sui key, 0x prefixed Bech32 format)
ADMIN_EXPORTED_KEY=suiprivkey1...
# Contract addresses
WORLD_PACKAGE_ID=0xabc... # Package ID after world-contracts deployment
BUILDER_PACKAGE_ID=0xdef... # Package ID after smart_gate deployment
# Tenant name (game world namespace)
TENANT=evefrontier
The Essence of .env Isn’t a Config Table, But Engineering Boundaries
As long as a value changes depending on the environment, it shouldn’t be scattered throughout script bodies.
The most common drifting values include:
- Network
- Package IDs
- Admin keys
- Tenant names
- Key object IDs
Once these things are written separately in scripts, frontend, and tests, troubleshooting later will be very painful.
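A minimal sketch of centralizing those values behind one validated accessor, in the spirit of the scaffold's `getEnvConfig` (the field names and shape here are assumptions for illustration, not the scaffold's actual implementation):

```typescript
// Validate that every environment-dependent value is present in one place,
// instead of reading process.env ad hoc inside each script.
// Field names mirror the .env example above; treat the exact shape as an assumption.
interface EnvConfig {
  network: "localnet" | "testnet" | "mainnet";
  worldPackageId: string;
  builderPackageId: string;
  tenant: string;
}

function readEnvConfig(env: Record<string, string | undefined>): EnvConfig {
  const required = ["NETWORK", "WORLD_PACKAGE_ID", "BUILDER_PACKAGE_ID", "TENANT"];
  for (const key of required) {
    if (!env[key]) throw new Error(`missing env var: ${key}`);
  }
  const network = env.NETWORK as EnvConfig["network"];
  if (!["localnet", "testnet", "mainnet"].includes(network)) {
    throw new Error(`invalid NETWORK: ${network}`);
  }
  return {
    network,
    worldPackageId: env.WORLD_PACKAGE_ID!,
    builderPackageId: env.BUILDER_PACKAGE_ID!,
    tenant: env.TENANT!,
  };
}
```

Failing fast at startup like this turns a “transaction mysteriously aborted” debugging session into a one-line error message.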
2. Execution Order and Functions of the 6 Scripts
Complete Execution Flow
① pnpm configure-rules → Set Gate extension rules (tribe ID, bounty item type_id)
② pnpm authorise-gate → Register extension to Gate object
③ pnpm authorise-storage-unit → Register extension to StorageUnit
④ pnpm issue-tribe-jump-permit → Issue pass for characters meeting tribal conditions
⑤ pnpm jump-with-permit → Jump with pass
⑥ pnpm collect-corpse-bounty → Submit corpse items → Receive pass (bounty flow)
3. Detailed Reading: configure-rules.ts
This is the most frequently modified script, responsible for initializing two types of rules:
// ts-scripts/smart_gate/configure-rules.ts
import { Transaction } from "@mysten/sui/transactions";
import { getEnvConfig, initializeContext, hydrateWorldConfig } from "../utils/helper";
import { resolveSmartGateExtensionIds } from "./extension-ids";
async function main() {
// 1. Read .env config
const env = getEnvConfig();
// 2. Initialize Sui client + keypair
const ctx = initializeContext(env.network, env.adminExportedKey);
const { client, keypair, address } = ctx;
// 3. Read world-contracts config from chain
await hydrateWorldConfig(ctx);
// 4. Query AdminCap, ExtensionConfig object IDs from chain
const { builderPackageId, adminCapId, extensionConfigId } =
await resolveSmartGateExtensionIds(client, address);
const tx = new Transaction();
// 5. Set tribe rule (tribe=100, valid for 1 hour)
tx.moveCall({
target: `${builderPackageId}::tribe_permit::set_tribe_config`,
arguments: [
tx.object(extensionConfigId),
tx.object(adminCapId),
tx.pure.u32(100), // Allowed tribe ID
tx.pure.u64(3600000), // Validity: 1 hour (milliseconds)
],
});
// 6. Set bounty rule (item type_id=ITEM_A_TYPE_ID, valid for 1 hour)
tx.moveCall({
target: `${builderPackageId}::corpse_gate_bounty::set_bounty_config`,
arguments: [
tx.object(extensionConfigId),
tx.object(adminCapId),
tx.pure.u64(ITEM_A_TYPE_ID), // Corpse item's type_id
tx.pure.u64(3600000),
],
});
// 7. Submit transaction
const result = await client.signAndExecuteTransaction({
transaction: tx,
signer: keypair,
options: { showEffects: true, showObjectChanges: true },
});
console.log("Transaction digest:", result.digest);
}
What Structure Is Most Worth Keeping in This Type of Script?
It’s this clear chain:
- Read environment
- Initialize context
- Resolve key on-chain objects
- Assemble PTB
- Submit and record digest
As long as you maintain this skeleton when adding new scripts in the future, the engineering will be much more stable.
Modifying Rule Parameters
Common modification points:
// Change to allow tribe ID = 3 (corresponding to your game world's tribe config)
tx.pure.u32(3),
// Change to 24-hour validity period
tx.pure.u64(24 * 60 * 60 * 1000),
// ITEM_A_TYPE_ID is defined in utils/constants.ts, adjust according to actual items
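When adjusting validity periods, writing the duration as explicit arithmetic (as the `24 * 60 * 60 * 1000` example above does) is less error-prone than hand-computing milliseconds. A tiny hypothetical helper makes the intent unmissable:

```typescript
// Convert hours to the millisecond values the rule-setter entry points expect.
function hoursToMs(hours: number): number {
  return hours * 60 * 60 * 1000;
}

// e.g. tx.pure.u64(hoursToMs(24)) for a 24-hour validity period
```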
4. Utility Function Analysis: utils/helper.ts
This is the shared base component for all scripts:
import { getEnvConfig, initializeContext, hydrateWorldConfig } from "../utils/helper";
// getEnvConfig(): Read .env and validate necessary fields
const env = getEnvConfig();
// → { network, rpcUrl, packageId, adminExportedKey, tenant }
// initializeContext(): Create Sui RPC client and Ed25519 keypair
const ctx = initializeContext(env.network, env.adminExportedKey);
// → { client, keypair, address, config, network }
// hydrateWorldConfig(): Read world config from chain (ObjectRegistry, AdminACL and other object IDs)
await hydrateWorldConfig(ctx);
// Afterwards can access all world object IDs via ctx.config
Key Utilities
utils/
├── helper.ts # Environment config, context initialization, world config reading
├── config.ts # Network types, WorldConfig interface, RPC URL mapping
├── constants.ts # TENANT, ITEM_A_TYPE_ID and other constants
├── derive-object-id.ts # Derive Sui object ID from game item_id (deterministic)
└── proof.ts # Generate LocationProof (for location verification testing)
Why Is helper.ts Both Important and Dangerous?
Because it naturally becomes the central file that all scripts depend on.
Important because:
- It unifies network, client, and config reading
- It reduces duplicate code
Dangerous because:
- It can easily expand infinitely
- Eventually absorbing a bunch of business logic too
So a more stable principle is: helper.ts should only do “common infrastructure”, not “specific business strategy”.
5. resolve-extension-ids.ts: Automatically Query Object IDs
// No need to manually query object IDs! Script will automatically find AdminCap and ExtensionConfig from chain
export async function resolveSmartGateExtensionIds(client, ownerAddress) {
// Find AdminCap object belonging to ownerAddress
const adminCapId = await findObjectByType(
client,
ownerAddress,
`${builderPackageId}::config::AdminCap`,
);
// Find shared ExtensionConfig object
const extensionConfigId = await findSharedObjectByType(
client,
`${builderPackageId}::config::ExtensionConfig`,
);
return { builderPackageId, adminCapId, extensionConfigId };
}
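Stripped of the RPC calls, the lookup helpers above boil down to filtering object summaries by their full Move type string. A sketch of that pure filtering step (the object shape shown is an assumption; real RPC responses nest the type under `data.type`):

```typescript
// Minimal shape of an owned-object summary, as an assumption for illustration.
interface ObjectSummary {
  objectId: string;
  type: string; // full Move type, e.g. "0xpkg::config::AdminCap"
}

// Pick the first object whose Move type matches exactly; undefined if none.
function findByType(objects: ObjectSummary[], fullType: string): string | undefined {
  return objects.find((o) => o.type === fullType)?.objectId;
}
```

Matching on the full `package::module::Struct` string matters: two different package versions produce different type strings, so a stale `builderPackageId` makes the lookup return nothing rather than a wrong object.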
6. Adding Scripts for Custom Contracts
Using the toll_gate example from Chapter 6, add a configure-toll.ts:
// ts-scripts/smart_gate/configure-toll.ts
import "dotenv/config";
import { Transaction } from "@mysten/sui/transactions";
import { getEnvConfig, initializeContext, hydrateWorldConfig } from "../utils/helper";
import { resolveSmartGateExtensionIds } from "./extension-ids";
async function main() {
const env = getEnvConfig();
const ctx = initializeContext(env.network, env.adminExportedKey);
await hydrateWorldConfig(ctx);
const { client, keypair } = ctx;
const { builderPackageId, adminCapId, extensionConfigId } =
await resolveSmartGateExtensionIds(client, ctx.address);
const tx = new Transaction();
tx.moveCall({
target: `${builderPackageId}::toll_gate::set_toll_config`,
arguments: [
tx.object(extensionConfigId),
tx.object(adminCapId),
tx.pure.u64(1_000_000_000), // Toll: 1 SUI = 10^9 MIST
tx.pure.u64(3600000), // Valid for 1 hour
],
});
const result = await client.signAndExecuteTransaction({
transaction: tx,
signer: keypair,
options: { showEffects: true },
});
console.log("Toll config set! Digest:", result.digest);
}
main();
Then add to package.json:
"scripts": {
"configure-toll": "tsx ts-scripts/smart_gate/configure-toll.ts"
}
7. dApp Template: Quick Start
cd dapps
pnpm install
cp .envsample .env # Fill in VITE_ITEM_ID and other variables
pnpm dev # Start dev server: http://localhost:5173
Tech Stack
| Library | Version | Purpose |
|---|---|---|
| React + TypeScript | 18 | UI framework |
| Vite | 5 | Build tool |
| Radix UI | 1 | UI component library |
| @evefrontier/dapp-kit | latest | EVE Frontier dedicated SDK |
| @mysten/dapp-kit-react | latest | Sui wallet connection |
Provider Architecture (main.tsx)
// src/main.tsx
ReactDOM.createRoot(document.getElementById("root")!).render(
<EveFrontierProvider queryClient={queryClient}>
{/* One Provider combines all necessary Contexts */}
{/* QueryClientProvider → DAppKitProvider → VaultProvider → SmartObjectProvider → NotificationProvider */}
<App />
</EveFrontierProvider>,
);
8. Core Hooks Reference
Wallet Connection (App.tsx)
import { abbreviateAddress, useConnection } from "@evefrontier/dapp-kit";
import { useCurrentAccount } from "@mysten/dapp-kit-react";
// Connect/disconnect wallet
const { handleConnect, handleDisconnect, isConnected, walletAddress } = useConnection();
// Read current account
const account = useCurrentAccount();
// Display abbreviated address (e.g., 0x1234...5678)
<span>{abbreviateAddress(account?.address ?? "")}</span>
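`abbreviateAddress` ships with dapp-kit, but a plausible equivalent is easy to write if you need the same display style outside the SDK. The behavior below is an assumption for illustration, not the SDK's exact output:

```typescript
// Shorten a 0x-address to "0x1234...cdef"-style display text.
// Hypothetical stand-in for dapp-kit's abbreviateAddress; exact format may differ.
function shortenAddress(address: string, visible = 4): string {
  // Short enough already? Return unchanged.
  if (address.length <= 2 + visible * 2) return address;
  return `${address.slice(0, 2 + visible)}...${address.slice(-visible)}`;
}
```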
Read Smart Object (Assembly Data)
import { useSmartObject } from "@evefrontier/dapp-kit";
// Pass in-game item_id (from URL params or env)
const { assembly, character, loading, error, refetch } = useSmartObject({
itemId: VITE_ITEM_ID,
});
// assembly contains: name, typeId, state, id, owner character
// character contains: holder character info
Execute Transaction (WalletStatus.tsx)
import { useDAppKit } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
const { signAndExecuteTransaction } = useDAppKit();
async function callMyContract() {
const tx = new Transaction();
tx.moveCall({
target: `${PACKAGE_ID}::tribe_permit::issue_jump_permit`,
arguments: [/* ... */],
});
const result = await signAndExecuteTransaction({ transaction: tx });
await refetch(); // Refresh assembly state
}
9. Practice: Issue Tribal Pass in dApp
// src/components/IssuePermit.tsx
import { useSmartObject, useConnection } from "@evefrontier/dapp-kit";
import { useDAppKit } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
export function IssuePermit({ gateItemId }: { gateItemId: string }) {
const { assembly } = useSmartObject({ itemId: gateItemId });
const { isConnected } = useConnection();
const { signAndExecuteTransaction } = useDAppKit();
const handleIssuePermit = async () => {
const tx = new Transaction();
tx.moveCall({
target: `${import.meta.env.VITE_BUILDER_PACKAGE_ID}::tribe_permit::issue_jump_permit`,
arguments: [
tx.object(import.meta.env.VITE_EXTENSION_CONFIG_ID),
tx.object(SOURCE_GATE_ID),
tx.object(DEST_GATE_ID),
tx.object(CHARACTER_ID),
tx.object("0x6"), // Clock object (Sui system object fixed ID)
],
});
const result = await signAndExecuteTransaction({ transaction: tx });
console.log("JumpPermit issued!", result.digest);
};
return (
<button
onClick={handleIssuePermit}
disabled={!isConnected || !assembly}
>
{assembly ? `Apply for ${assembly.name}` : "Loading..."}
</button>
);
}
10. Sponsored Transactions (Sponsored TX)
For Builders wanting to hide gas fees, dapp-kit supports sponsored transactions:
import { useSponsoredTransaction } from "@evefrontier/dapp-kit";
const { sponsoredSignAndExecute } = useSponsoredTransaction();
// Player doesn't need to pay gas — Builder's server pays for them
await sponsoredSignAndExecute({ transaction: tx });
// Note: Only EVE Vault wallet supports this feature
// If user uses other wallets, need to catch WalletSponsoredTransactionNotSupportedError
11. GraphQL Data Query (Advanced)
When useSmartObject isn’t sufficient, you can use GraphQL directly:
import { executeGraphQLQuery, getAssemblyWithOwner } from "@evefrontier/dapp-kit";
// Query Gate's complete data (including owner character)
const gateData = await getAssemblyWithOwner({ itemId: gateItemId });
// Execute custom GraphQL query
const result = await executeGraphQLQuery(`
query GetMyGates($owner: SuiAddress!) {
objects(filter: { type: "${PACKAGE_ID}::smart_gate::Gate", owner: $owner }) {
nodes {
address
contents { json }
}
}
}
`, { owner: address });
12. Complete Project Setup Flow Summary
1. Clone builder-scaffold
2. Clone world-contracts (Docker users: on host, automatically visible in container)
3. Choose workflow: Docker or Host
4. Start local chain (docker compose run or sui start)
5. Deploy world-contracts (refer to docs/builder-flow-docker.md)
6. Compile smart_gate: sui move build -e testnet
7. Deploy smart_gate: sui client test-publish --pubfile-path ...
8. Fill .env file (BUILDER_PACKAGE_ID + WORLD_PACKAGE_ID + ADMIN_KEY)
9. Run pnpm configure-rules → pnpm authorise-gate → pnpm issue-tribe-jump-permit
10. Start dApp: cd dapps && pnpm dev
Chapter Summary
| Component | Purpose |
|---|---|
| configure-rules | Set tribe + bounty config rules |
| authorise-gate | Register XAuth to target Gate |
| issue-tribe-jump-permit | Issue JumpPermit for qualified players |
| utils/helper.ts | Environment variables, Sui client, world config initialization |
| EveFrontierProvider | Uniformly wraps all React Contexts |
| useSmartObject | Core Hook for reading on-chain Assembly data |
| useSponsoredTransaction | Sponsored transactions paying Gas for players |
These two chapters cover the complete chain from local setup to contract deployment, script interaction, and frontend development for Builder Scaffold. Combined with previous World contract chapters, you now have all the knowledge to independently build an end-to-end EVE Frontier Builder application.
Chapter 8: Sponsored Transactions & Server-Side Integration
Goal: Deeply understand EVE Frontier’s sponsored transaction mechanism, master how to build backend services for business logic validation and Gas payment on behalf of players, achieving frictionless gameplay experiences.
Status: Engineering chapter. Main content focuses on sponsored transactions, server-side validation, and on-chain/off-chain coordination.
8.1 What are Sponsored Transactions?
In regular Sui transactions, the sender and gas owner are the same person. Sponsored transactions allow these two roles to be separated:
Regular transaction: Player signs + Player pays Gas
Sponsored transaction: Player signs intent + Server validates + Server pays Gas
Critical for EVE Frontier because:
- Certain operations require game server validation (such as proximity proofs, distance checks)
- Lowers player entry barrier (no need to pre-fund SUI for Gas)
- Enables business-level risk control: Server can reject invalid requests
The real key here isn’t simply “who pays Gas for whom,” but rather:
Sponsored transactions break down a player action into three stages: “user intent + server review + on-chain execution.”
This makes many product experiences possible that were previously very difficult:
- Players don’t need to prepare SUI in advance
- Server can make business judgments before going on-chain
- Risk control can happen before signing, rather than remedying after asset incidents
But the cost is also clear: your system is no longer just frontend + contract — it becomes, in effect, an on-chain/off-chain coordinated system.
8.2 AdminACL: Game Server’s Permission Object
EVE Frontier uses the AdminACL shared object to manage which server addresses are authorized as sponsors:
GovernorCap
└──(manages) AdminACL (shared object)
└── sponsors: vector<address>
├── Game Server 1 address
├── Game Server 2 address
└── ...
Operations requiring server participation (like jumping) have checks like this in the contract:
public fun verify_sponsor(admin_acl: &AdminACL, ctx: &TxContext) {
// tx_context::sponsor() returns the Gas payer's address
let sponsor = ctx.sponsor().destroy_some(); // aborts if no sponsor
assert!(
vector::contains(&admin_acl.sponsors, &sponsor),
EUnauthorizedSponsor,
);
}
This means: even if a player constructs a valid transaction themselves, calling functions like jump_with_permit will abort without an authorized server signature.
What does AdminACL really express?
It doesn’t express “this server can technically sign,” but rather:
This server is officially trusted by the world rules to vouch for certain sensitive actions.
This is fundamentally different from regular backend services. In many Web applications, the backend just helps you make business judgments; here, the backend is part of the on-chain permission model itself.
So once AdminACL management becomes chaotic, it affects not a single interface, but the entire chain of trust:
- Who can sponsor payments
- Who can vouch for proximity proofs
- Who can initiate certain restricted actions
8.3 Complete Sponsored Transaction Flow
Player Your Backend Service Sui Network
│ │ │
│── 1. Build Transaction ──►│ │
│ (setSender = player addr)│ │
│ │ │
│◄── 2. Backend validates ──│ │
│ (check proximity, balance, etc.) │
│ │ │
│── 3. Player signs (Sender)──►│ │
│ │ │
│ │── 4. Server signs (Gas) ───►│
│ │ (setGasOwner = server) │
│ │ │
│◄─────────────────────────┼── 5. Transaction result ───│
What does each segment in this chain protect against?
- Player builds the transaction: prevents the server from fabricating intent on the user's behalf
- Backend validates business logic: prevents requests that don't meet conditions from going directly on-chain
- Player signature: proves this is indeed a user-authorized action
- Server signature: proves the platform is willing to sponsor and vouch for this action
All four segments are indispensable. Missing one leads to typical problems:
- No player signature: platform can send on behalf of users arbitrarily
- No backend validation: anyone can freeload sponsorship
- No server signature: restricted on-chain entry points fail directly
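On the server side, the first two segments arrive as a request payload, and a cheap guard can reject incomplete requests before any validation or signing work. A sketch (field names follow the SponsoredTxRequest shape used later in this chapter):

```typescript
// Reject sponsorship requests missing any client-supplied segment
// before spending effort on validation or signing.
interface SponsorRequestBody {
  txBytes?: string;         // player-built intent (base64)
  playerSignature?: string; // player's authorization over txBytes
  playerAddress?: string;   // used for business validation
}

function missingSegments(req: SponsorRequestBody): string[] {
  const missing: string[] = [];
  if (!req.txBytes) missing.push("txBytes");
  if (!req.playerSignature) missing.push("playerSignature");
  if (!req.playerAddress) missing.push("playerAddress");
  return missing;
}
```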
8.4 Building a Simple Backend Sponsorship Service
Project Structure
backend/
├── src/
│ ├── server.ts # Express server
│ ├── sponsor.ts # Sponsorship transaction logic
│ ├── validators.ts # Business validation
│ └── config.ts # Configuration
└── package.json
sponsor.ts: Core Sponsorship Logic
// src/sponsor.ts
import { SuiClient } from "@mysten/sui/client";
import { Ed25519Keypair } from "@mysten/sui/keypairs/ed25519";
import { Transaction } from "@mysten/sui/transactions";
import { fromBase64 } from "@mysten/sui/utils";
const client = new SuiClient({
url: process.env.SUI_RPC_URL ?? "https://fullnode.testnet.sui.io:443",
});
// Server signing key (securely stored in environment variables)
const serverKeypair = Ed25519Keypair.fromSecretKey(
fromBase64(process.env.SERVER_PRIVATE_KEY!)
);
export interface SponsoredTxRequest {
txBytes: string; // Player-built transaction (base64)
playerSignature: string; // Player's signature on txBytes (base64)
playerAddress: string;
}
export async function sponsorAndExecute(req: SponsoredTxRequest) {
// 1. Deserialize the player's transaction bytes
const txBytes = fromBase64(req.txBytes);
// 2. Verify the player already named this server as the Gas owner.
// Signatures cover the full transaction bytes, so the server must not
// change the Gas owner after the player has signed; that would
// invalidate the player's signature. (An alternative flow: the player
// sends only the transaction kind, the server attaches gas data and
// returns the bytes for the player to sign.)
const tx = Transaction.from(txBytes);
const serverAddress = serverKeypair.getPublicKey().toSuiAddress();
if (tx.getData().gasData.owner !== serverAddress) {
throw new Error("Transaction does not name this server as Gas owner");
}
// 3. Server signs the exact same bytes (as Gas payer)
const serverSig = await serverKeypair.signTransaction(txBytes);
// 4. Execute: submit both the player's and the server's signature
const result = await client.executeTransactionBlock({
transactionBlock: txBytes,
signature: [
req.playerSignature, // Player's signature as Sender
serverSig.signature, // Server's signature as Gas Owner
],
options: { showEvents: true, showEffects: true },
});
return result;
}
What the server needs to guard against most isn’t “request failure” but “request abuse”
A truly usable sponsorship service should at least consider these risk control points:
- Same player repeating requests in short time
- Same transaction being submitted repeatedly
- Certain high-cost operations being batch-scraped
- Players sneaking in transactions that shouldn’t be sponsored
So in real projects, sponsorship services usually also add:
- Request rate limiting
- Transaction whitelists or entry whitelists
- Budget limits per action
- Request logging and audit trails
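One of those additions, request rate limiting, can start as small as a fixed-window counter. A sketch, assuming an in-memory per-process store (production services would typically back this with Redis):

```typescript
// Fixed-window rate limiter: at most `limit` sponsored requests
// per player per `windowMs`. In-memory, so per-process only.
class RateLimiter {
  private hits = new Map<string, { windowStart: number; count: number }>();
  constructor(private limit: number, private windowMs: number) {}

  allow(playerAddress: string, now = Date.now()): boolean {
    const h = this.hits.get(playerAddress);
    if (!h || now - h.windowStart >= this.windowMs) {
      // first request, or the previous window has elapsed: reset
      this.hits.set(playerAddress, { windowStart: now, count: 1 });
      return true;
    }
    if (h.count >= this.limit) return false; // over budget this window
    h.count += 1;
    return true;
  }
}
```

Call `allow(playerAddress)` at the top of each sponsorship endpoint and return HTTP 429 when it yields `false`.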
validators.ts: Business Validation Logic
// src/validators.ts
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: process.env.SUI_RPC_URL! });
// Validate proximity (simplified: check if two components' game coordinates are close enough)
export async function validateProximity(
playerAddress: string,
assemblyId: string,
): Promise<boolean> {
// In real scenarios, this would query game server or on-chain location hash
// This is just an example implementation
try {
const assembly = await client.getObject({
id: assemblyId,
options: { showContent: true },
});
// Check if player is near component (game physics rule validation)
// Real implementation needs to communicate with game server
return true; // Simplified
} catch {
return false;
}
}
// Validate if player meets conditions (e.g., holds specific NFT)
export async function validatePlayerCondition(
playerAddress: string,
requiredNftType: string,
): Promise<boolean> {
const objects = await client.getOwnedObjects({
owner: playerAddress,
filter: { StructType: requiredNftType },
});
return objects.data.length > 0;
}
Why shouldn’t validation logic be mixed with execution logic?
Because these two things change at different rates:
- Validation rules iterate frequently
- Execution pathways need to remain as stable as possible
By separating them, you get several direct benefits:
- Risk control rules are easier to update independently
- Easier to compose different validators for different actions
- Easier to do gradual rollouts and replay analysis
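Keeping each rule as a standalone async validator makes this separation concrete: the execution path only knows "run these validators, stop at the first failure." A sketch with stub validators (the rule names are illustrative):

```typescript
// A validator returns null on pass, or an error message on failure.
type Validator = (playerAddress: string) => Promise<string | null>;

// Run validators in order; the first failure wins. The execution
// path stays stable while the rule list changes freely per action.
async function runValidators(
  playerAddress: string,
  validators: Validator[],
): Promise<string | null> {
  for (const v of validators) {
    const error = await v(playerAddress);
    if (error) return error;
  }
  return null;
}

// Stub validators standing in for real on-chain / game-server checks
const notBanned: Validator = async (a) =>
  a === "0xBANNED" ? "player is banned" : null;
const wellFormed: Validator = async (a) =>
  a.startsWith("0x") ? null : "malformed address";
```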
server.ts: REST API Server
// src/server.ts
import express from "express";
import { sponsorAndExecute, SponsoredTxRequest } from "./sponsor";
import { validateProximity, validatePlayerCondition } from "./validators";
const app = express();
app.use(express.json());
// Sponsor jump request
app.post("/api/sponsor/jump", async (req, res) => {
const { txBytes, playerSignature, playerAddress, gateId } = req.body;
try {
// 1. Validate proximity (player must be near stargate)
const isNear = await validateProximity(playerAddress, gateId);
if (!isNear) {
return res.status(400).json({ error: "Player not near stargate" });
}
// 2. Execute sponsored transaction
const result = await sponsorAndExecute({
txBytes,
playerSignature,
playerAddress,
});
res.json({ success: true, digest: result.digest });
} catch (err: any) {
res.status(500).json({ error: err.message });
}
});
// Sponsor general action (with custom validation)
app.post("/api/sponsor/action", async (req, res) => {
const { txBytes, playerSignature, playerAddress, actionType, metadata } = req.body;
try {
// Different validation based on actionType
switch (actionType) {
case "deposit_ore": {
// Validate near storage box
const ok = await validateProximity(playerAddress, metadata.ssuId);
if (!ok) return res.status(400).json({ error: "Not nearby" });
break;
}
case "special_gate": {
// Validate holding VIP NFT
const hasNft = await validatePlayerCondition(
playerAddress,
`${process.env.MY_PACKAGE}::vip_pass::VipPass`
);
if (!hasNft) return res.status(403).json({ error: "VIP pass required" });
break;
}
}
const result = await sponsorAndExecute({ txBytes, playerSignature, playerAddress });
res.json({ success: true, digest: result.digest });
} catch (err: any) {
res.status(500).json({ error: err.message });
}
});
app.listen(3001, () => console.log("Sponsorship service running on :3001"));
Idempotency is the most easily overlooked issue in sponsorship services
Player network jitter, frontend retries, users frantically clicking buttons—all can cause the same request to be sent multiple times.
If your backend doesn’t have idempotency design, you’ll see:
- Same business request being sponsored repeatedly
- Users think they clicked once, but two transactions went on-chain
- Budgets and statistics all become distorted
In real projects, you should at least give each business action a stable request ID and record server-side whether “this request has already been processed.”
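A minimal version of that idempotency record, assuming the client attaches a stable requestId to each business action (in-memory here; a shared store is needed once you run more than one server process):

```typescript
// Idempotency guard: the first caller for a requestId wins; retries
// of the same requestId get the recorded result back instead of
// triggering a second sponsored transaction.
class IdempotencyStore {
  private done = new Map<string, string>(); // requestId -> tx digest

  /** Returns the previously recorded digest, or null if unseen. */
  seen(requestId: string): string | null {
    return this.done.get(requestId) ?? null;
  }

  record(requestId: string, digest: string): void {
    this.done.set(requestId, digest);
  }
}
```

In the endpoint: check `seen()` first and short-circuit with the stored digest, then `record()` after a successful execution.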
8.5 Frontend Integration with Sponsored Transactions
// src/hooks/useSponsoredAction.ts
import { useSuiClient, useWallet } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
import { toBase64 } from "@mysten/sui/utils";
const BACKEND_URL = import.meta.env.VITE_BACKEND_URL ?? "http://localhost:3001";
// The sponsor's address must be in the bytes the player signs, because
// the Gas owner is covered by the signature. Expose it to the frontend,
// for example via an env variable.
const SPONSOR_ADDRESS = import.meta.env.VITE_SPONSOR_ADDRESS;
export function useSponsoredAction() {
const wallet = useWallet();
const suiClient = useSuiClient();
const executeSponsoredJump = async (
tx: Transaction,
gateId: string,
) => {
if (!wallet.currentAccount) throw new Error("Please connect wallet");
const playerAddress = wallet.currentAccount.address;
// 1. Fix sender and Gas owner, then build and sign, but don't submit
tx.setSender(playerAddress);
tx.setGasOwner(SPONSOR_ADDRESS);
const txBytes = await tx.build({ client: suiClient });
const { signature: playerSig } = await wallet.signTransaction({
transaction: tx,
});
// 2. Send to the backend; the server validates and co-signs as Gas owner
const response = await fetch(`${BACKEND_URL}/api/sponsor/jump`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
txBytes: toBase64(txBytes),
playerSignature: playerSig,
playerAddress,
gateId,
}),
});
if (!response.ok) {
const { error } = await response.json();
throw new Error(error);
}
return response.json();
};
return { executeSponsoredJump };
}
8.6 Security Considerations for Sponsored Transactions
| Risk | Defense Measures |
|---|---|
| Server private key leak | Use HSM or KMS to store private keys; rotate regularly |
| Malicious players replaying transactions | Executed transactions consume owned-object versions, so the same transaction cannot be replayed |
| DDoS attacks on backend | Rate limiting + IP blocking + require player auth |
| Bypassing validation to submit directly | On-chain contract’s verify_sponsor enforces authorized address requirement |
| Gas depletion | Monitor server account balance, set alert thresholds |
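The "Gas depletion" row can start as a simple threshold check over the sponsor account's balance, run on a timer. A sketch (the threshold is a product choice; Sui balances are denominated in MIST, 1 SUI = 10^9 MIST):

```typescript
// Decide whether the sponsor balance has fallen below an alert
// threshold. Balances are in MIST (1 SUI = 1_000_000_000 MIST).
const MIST_PER_SUI = 1_000_000_000n;

function gasAlert(balanceMist: bigint, thresholdSui: bigint): boolean {
  return balanceMist < thresholdSui * MIST_PER_SUI;
}
```

Wire it to whatever balance query and alerting channel you already use; the decision itself should stay this boring.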
8.7 @evefrontier/dapp-kit Built-in Sponsorship Support
The official SDK has built-in support for sponsored transactions:
import { signAndExecuteSponsoredTransaction } from "@evefrontier/dapp-kit";
// SDK automatically communicates with EVE Frontier backend to complete sponsorship
const result = await signAndExecuteSponsoredTransaction({
transaction: tx,
// No need to manually handle signatures and backend communication
});
Applicable scenarios: Official game operations (like component online/offline, warehouse transfers) can typically use the official sponsorship service.
When you need to build your own backend: When your extension contracts need custom business validation (like checking NFT holdings, in-game conditions), you need to deploy your own sponsorship service.
Summary
| Knowledge Point | Core Concept |
|---|---|
| Sponsored transaction essence | Sender (player) and Gas Owner (server) are separated |
| AdminACL | Game contract verifies ctx.sponsor() must be in authorized list |
| Backend service responsibilities | Business validation + server signature + merged signature submission |
| Security essentials | Private key protection + Rate Limiting + contract-level safeguards |
| SDK support | signAndExecuteSponsoredTransaction() handles official scenarios |
Further Reading
Chapter 9: Off-chain Indexing & GraphQL Advanced Usage
Goal: Master the complete toolkit for off-chain data querying, including GraphQL, gRPC, event subscriptions, and custom indexers, to build high-performance data-driven dApps.
Status: Engineering chapter. Main content focuses on GraphQL, events, and indexer design.
9.1 Read-Write Separation Principle
The golden rule of EVE Frontier development:
Write operations (modify on-chain state) → Submit via Transaction → Consume Gas
Read operations (query on-chain state) → Via GraphQL/gRPC/SuiClient → Completely free
Design Guidance: Move all possible logic to off-chain reads, and only submit transactions when you truly need to change state.
This principle seems simple, but it actually determines your entire system cost structure:
- The more you write on-chain, the higher the Gas and the greater the failure surface
- The better you read off-chain, the faster the frontend and the lighter the interaction
So a mature Builder system typically doesn’t “stuff everything on-chain,” but clearly divides into three layers:
- On-chain objects Store state that must be trustworthy
- On-chain events Store actions that have occurred
- Off-chain indexes Store the views that the frontend actually needs to consume
If these three layers aren’t separated, your frontend will eventually become a bunch of expensive and hard-to-maintain real-time RPC calls.
9.2 SuiClient Basic Reading
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" });
// ❶ Read a single object
const gate = await client.getObject({
id: "0x...",
options: { showContent: true, showOwner: true, showType: true },
});
console.log(gate.data?.content);
// ❷ Batch read multiple objects (one request)
const objects = await client.multiGetObjects({
ids: ["0x...gate1", "0x...gate2", "0x...ssu"],
options: { showContent: true },
});
// ❸ Query all objects owned by an address
const ownedObjects = await client.getOwnedObjects({
owner: "0xALICE",
filter: { StructType: `${WORLD_PKG}::gate::Gate` },
options: { showContent: true },
});
// ❹ Paginated query (handling large amounts of data)
let cursor: string | null = null;
const allGates: any[] = [];
do {
const page = await client.getOwnedObjects({
owner: "0xALICE",
cursor,
limit: 50,
});
allGates.push(...page.data);
cursor = page.nextCursor ?? null;
} while (cursor);
What is SuiClient best suited for?
It’s best suited for:
- Single object reads
- Small-scale batch reads
- Debugging and script validation
- Lightweight queries for the frontend
It may not be directly suitable for:
- Large-scale leaderboards
- Aggregated views across multiple object types
- High-frequency complex filtering
Once your query needs start requiring “sorting, aggregation, joining across objects,” it’s time to consider GraphQL or a custom indexing layer.
9.3 Deep GraphQL Usage
Sui’s GraphQL interface is more powerful than JSON-RPC, supporting complex filtering, nested queries, and cursor pagination.
Connecting to GraphQL
import { SuiGraphQLClient, graphql } from "@mysten/sui/graphql";
const graphqlClient = new SuiGraphQLClient({
url: "https://graphql.testnet.sui.io/graphql",
});
Query all objects of a certain type
const GET_ALL_GATES = graphql(`
query GetAllGates($type: String!, $after: String) {
objects(filter: { type: $type }, first: 50, after: $after) {
pageInfo {
hasNextPage
endCursor
}
nodes {
address
asMoveObject {
contents {
json # Return fields in JSON format
}
}
}
}
}
`);
async function getAllGates(): Promise<any[]> {
const results: any[] = [];
let after: string | null = null;
do {
const data = await graphqlClient.query({
query: GET_ALL_GATES,
variables: {
type: `${WORLD_PKG}::gate::Gate`,
after,
},
});
const objects = data.data?.objects;
if (!objects) break;
results.push(...objects.nodes.map(n => n.asMoveObject?.contents?.json));
after = objects.pageInfo.hasNextPage ? objects.pageInfo.endCursor : null;
} while (after);
return results;
}
The real value of GraphQL isn’t just “more elegant syntax”
Its more important value is allowing you to organize queries according to frontend views, rather than being led by single-object RPC interfaces.
This matters in real products, because pages rarely need the raw shape of a single object; they need:
- Current object + associated object summary
- A page list + pagination information
- Multiple object types combined into one dashboard
GraphQL is not omnipotent either
If you treat it like a database and fetch data without limits, you’ll still run into problems:
- Queries that are too large make the first screen slow
- Too many nested objects on one page make debugging difficult
- One change to a complex query breaks frontend and backend together
So the best use of GraphQL is usually:
- Split queries by page
- Each query serves only one clear view type
- When aggregation statistics are needed, let custom indexers take on more responsibility
Query multiple related objects (nested)
// Query star gate and its associated network node information
const GET_GATE_WITH_NODE = graphql(`
query GetGateWithNode($gateId: SuiAddress!) {
object(address: $gateId) {
address
asMoveObject {
contents { json }
}
}
}
`);
// Batch: query multiple different types at once
const GET_ASSEMBLY_OVERVIEW = graphql(`
query AssemblyOverview($gateId: SuiAddress!, $ssuId: SuiAddress!) {
gate: object(address: $gateId) {
asMoveObject { contents { json } }
}
ssu: object(address: $ssuId) {
asMoveObject { contents { json } }
}
}
`);
Query by dynamic field (Table content)
// Query specific entry in Market's listings Table
const GET_LISTING = graphql(`
query GetListing($marketId: SuiAddress!, $typeId: String!) {
object(address: $marketId) {
dynamicField(name: { type: "u64", bcs: $typeId }) {
value {
... on MoveValue {
json
}
}
}
}
}
`);
Why are dynamic field queries more troublesome than normal object fields?
Because dynamic fields are naturally closer to “index structures that grow at runtime” rather than fixed schemas.
This means:
- You must be very clear about the key encoding method
- Frontend and indexing layer must use the same key rules
- Once the key design changes, the read path will fail entirely
So the design of dynamic fields is not just an internal contract issue, it will directly spill over to the query and frontend layers.
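One way to keep the key rules aligned is a single shared encoding helper used by both the frontend and the indexer. A sketch for u64 keys, which BCS lays out as 8 little-endian bytes (the helper name and hex transport format are this example's own choices):

```typescript
// Shared helper: encode a u64 dynamic-field key as its BCS byte
// layout (8 bytes, little-endian), hex-encoded for transport.
// Both the frontend and the indexing layer import this one function.
function encodeU64Key(value: bigint): string {
  if (value < 0n || value > 0xffffffffffffffffn) {
    throw new Error("value out of u64 range");
  }
  const bytes = new Uint8Array(8);
  for (let i = 0; i < 8; i++) {
    bytes[i] = Number((value >> BigInt(8 * i)) & 0xffn);
  }
  return Array.from(bytes, (b) => b.toString(16).padStart(2, "0")).join("");
}
```

If the key design ever changes, it changes in exactly one module, and every read path fails or succeeds together.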
9.4 Real-time Event Subscription
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" });
// Subscribe to all events from a specific package
const unsubscribe = await client.subscribeEvent({
filter: { Package: MY_PACKAGE },
onMessage: (event) => {
switch (event.type) {
case `${MY_PACKAGE}::toll_gate_ext::GateJumped`:
handleGateJump(event.parsedJson);
break;
case `${MY_PACKAGE}::market::ItemSold`:
handleItemSold(event.parsedJson);
break;
}
},
});
// Unsubscribe after 90 seconds
setTimeout(unsubscribe, 90_000);
// Query historical events (with filtering)
const history = await client.queryEvents({
query: {
And: [
{ MoveEventType: `${MY_PACKAGE}::toll_gate_ext::GateJumped` },
{ Sender: "0xPlayerAddress..." },
],
},
order: "descending",
limit: 100,
});
What problems are event subscriptions best suited to solve?
Best suited for:
- Real-time notifications
- Activity streams
- Lightweight incremental updates
- Indexer consuming new transactions
Not suitable as:
- The sole source of current state
- Complete business list interface
- Highly reliable historical database
Because event streams naturally have two real-world issues:
- You might disconnect and miss messages
- You always need a historical backfill mechanism
So mature indexers usually:
- First replay history
- Then subscribe to incremental changes
- Periodically perform consistency checks
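One detail the replay-then-subscribe pattern needs is deduplication across the overlap between backfill and the live stream. A sketch with plain data (the string id stands in for Sui's {txDigest, eventSeq} event identifier):

```typescript
// Replay history first, then consume live events, deduplicating on
// a stable event ID so the backfill/stream overlap is harmless.
interface IndexedEvent {
  id: string;        // stable identity, e.g. "txDigest:eventSeq"
  character: string; // who jumped
}

class JumpIndex {
  private seen = new Set<string>();
  readonly counts = new Map<string, number>();

  apply(ev: IndexedEvent): void {
    if (this.seen.has(ev.id)) return; // already counted during replay
    this.seen.add(ev.id);
    this.counts.set(ev.character, (this.counts.get(ev.character) ?? 0) + 1);
  }
}
```

Both `loadHistoricalEvents` and the `subscribeEvent` handler would funnel into the same `apply()`, so ordering and overlap stop mattering.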
9.5 gRPC: High-throughput Data Streams
For scenarios requiring processing large amounts of real-time data (such as leaderboards, full network state snapshots), gRPC is more efficient than GraphQL:
// Use gRPC to stream the latest checkpoints.
// gRPC is suited to monitoring state changes across the whole chain:
// each checkpoint contains a summary of all transactions in that period.
// Advanced usage: typically used when building custom indexers.
When is it worth using gRPC instead of continuing to pile on RPC / GraphQL?
When you start encountering these scenarios:
- Need to consume checkpoints long-term
- Need to maintain your own near real-time index
- Need high-throughput, low-latency on-chain data streams
If you’re just building a regular dApp page, you usually don’t need to start with gRPC. It’s more like an “infrastructure building tool,” not a page query tool.
9.6 Building Custom Off-chain Indexers
For complex query needs (such as leaderboards, aggregate statistics), you can build your own indexing service:
// server/indexer.ts
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: process.env.SUI_RPC! });
// In-memory index (small scale; use Redis or PostgreSQL for production)
const jumpLeaderboard = new Map<string, number>(); // address → jump count
// Start indexer: listen to events and update local state
async function startIndexer() {
console.log("Indexer starting...");
// First load historical data
await loadHistoricalEvents();
// Then subscribe to new events
await client.subscribeEvent({
filter: { Package: MY_PACKAGE },
onMessage: (event) => {
if (event.type.includes("GateJumped")) {
const { character_id } = event.parsedJson as any;
const count = jumpLeaderboard.get(character_id) ?? 0;
jumpLeaderboard.set(character_id, count + 1);
}
},
});
}
async function loadHistoricalEvents() {
// queryEvents cursors are event IDs ({ txDigest, eventSeq }), not strings;
// use the page's hasNextPage flag to terminate.
let cursor: { txDigest: string; eventSeq: string } | null = null;
let hasNextPage = true;
while (hasNextPage) {
const page = await client.queryEvents({
query: { MoveEventType: `${MY_PACKAGE}::toll_gate_ext::GateJumped` },
cursor,
limit: 200,
});
for (const event of page.data) {
const { character_id } = event.parsedJson as any;
const count = jumpLeaderboard.get(character_id) ?? 0;
jumpLeaderboard.set(character_id, count + 1);
}
cursor = page.nextCursor ?? null;
hasNextPage = page.hasNextPage;
}
}
// API: Provide leaderboard data
import express from "express";
const app = express();
app.get("/api/leaderboard", (req, res) => {
const sorted = [...jumpLeaderboard.entries()]
.sort((a, b) => b[1] - a[1])
.slice(0, 50)
.map(([address, count], rank) => ({ rank: rank + 1, address, count }));
res.json(sorted);
});
startIndexer().then(() => app.listen(3002));
9.7 Efficiently Displaying On-chain Data in dApps
Using React Query for Caching & Auto-refresh
// src/hooks/useLeaderboard.ts
import { useQuery } from "@tanstack/react-query";
export function useLeaderboard() {
return useQuery({
queryKey: ["leaderboard"],
queryFn: async () => {
const res = await fetch("/api/leaderboard");
return res.json();
},
refetchInterval: 30_000, // Refresh every 30 seconds
staleTime: 25_000, // Don't re-request within 25 seconds
});
}
// Usage
function Leaderboard() {
const { data, isLoading } = useLeaderboard();
return (
<table>
<thead><tr><th>#</th><th>Player</th><th>Jump Count</th></tr></thead>
<tbody>
{data?.map(({ rank, address, count }) => (
<tr key={address}>
<td>{rank}</td>
<td>{address.slice(0, 8)}...</td>
<td>{count}</td>
</tr>
))}
</tbody>
</table>
);
}
Chapter Summary
| Tool | Scenario | Features |
|---|---|---|
| SuiClient.getObject() | Read single/multiple objects | Simple and direct |
| GraphQL | Complex filtering, nested queries | Flexible, TypeScript type generation |
| subscribeEvent | Real-time event push | WebSocket, suitable for dApps |
| queryEvents | Historical event pagination query | Suitable for data analysis |
| Custom indexer | Complex aggregation, leaderboards | Full control, need to maintain yourself |
Further Reading
Chapter 10: EVE Vault & dApp Integration Practices
Learning Objectives: Master the complete process of integrating EVE Vault into Builder dApps—account discovery, connection, transaction signing, sponsored transactions, and handling zkLogin-specific Epoch refresh and disconnection scenarios.
Status: Teaching example. API descriptions in the text are based on current dependency versions and example dApps in this repository. Verify against local package versions during actual integration.
Minimal Call Chain
dApp Provider initialization -> useConnection wallet discovery -> Build PTB -> EVE Vault approval/signing -> On-chain execution -> dApp refresh object state
Wallet Capability Matrix
| Capability | Standard Sui Wallet | EVE Vault |
|---|---|---|
| Discovery & Connection | Supported | Supported |
| Regular Transaction Signing | Supported | Supported |
| Sponsored Tx | Usually not supported | Supported |
| zkLogin / Epoch Handling | Depends on wallet implementation | Built-in handling |
| In-game overlay integration | Usually none | Can cooperate with EVE Frontier scenarios |
This table isn’t marketing; it’s a reminder that the integration layer must first detect wallet capabilities and only then decide whether to show sponsored-transaction entry points.
The real awareness this chapter should establish is:
Wallet integration isn’t just “connect and done,” but requires designing complete interaction fallback paths according to wallet capability differences.
In other words, your dApp cannot assume all wallets are equivalent.
Exception Handling Sequence
When users report “wallet connects but transactions won’t send,” check in this order:
- First confirm if the current wallet supports Sponsored Tx
- Then confirm network, package id, and object IDs are consistent
- Then confirm whether the zkLogin proof has expired and whether maxEpoch needs a refresh
- Finally check whether the frontend correctly handles disconnection and state recovery after reconnection
Corresponding Code Directories
1. dApp Integration Overview
Because EVE Vault implements the complete Sui Wallet Standard, any dApp using @mysten/dapp-kit or @evefrontier/dapp-kit can discover and connect to EVE Vault with zero configuration.
Meanwhile, EVE Vault also implements EVE Frontier’s proprietary sponsored transaction extension, allowing Builders to pay Gas for players.
So the integration layer typically needs to answer at least three things:
- Is there currently a wallet?
- Is it currently EVE Vault?
- Does the current operation require Sponsored Tx capability?
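Those three questions usually collapse into one decision: which signing path to offer the current user. A sketch over wallet-standard feature strings (the sponsored-transaction feature name is illustrative; check the names your installed wallets actually advertise):

```typescript
// Pick a signing path from the wallet's advertised feature strings.
// "evefrontier:signSponsoredTransaction" is an illustrative feature
// name; verify against your installed wallet packages.
type SignPath = "sponsored" | "regular" | "none";

function chooseSignPath(features: string[] | null): SignPath {
  if (!features) return "none"; // no wallet discovered at all
  if (features.includes("evefrontier:signSponsoredTransaction")) {
    return "sponsored";
  }
  if (features.includes("sui:signTransaction")) return "regular";
  return "none";
}
```

The UI then renders the sponsored entry point only on `"sponsored"`, a plain signing flow on `"regular"`, and an install prompt on `"none"`.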
2. Install Dependencies
# EVE Frontier dedicated SDK (recommended, includes EVE Vault sponsored transaction support)
npm install @evefrontier/dapp-kit
# Or Mysten official SDK (basic Wallet Standard, no sponsored transactions)
npm install @mysten/dapp-kit
3. Provider Configuration
// src/main.tsx
import { EveFrontierProvider } from "@evefrontier/dapp-kit";
import { QueryClient } from "@tanstack/react-query";
import ReactDOM from "react-dom/client";
const queryClient = new QueryClient();
ReactDOM.createRoot(document.getElementById("root")!).render(
<EveFrontierProvider queryClient={queryClient}>
<App />
</EveFrontierProvider>,
);
EveFrontierProvider automatically initializes:
- QueryClientProvider (React Query)
- DAppKitProvider (Sui client + Wallet)
- VaultProvider (EVE Vault connection state)
- SmartObjectProvider (Game object GraphQL queries)
- NotificationProvider (On-chain operation notifications)
The key to Provider here isn’t “how many layers are wrapped,” but capability ordering.
Your later connection, signing, object queries, and notification experience all depend on this initialization order being correct.
4. Connect Wallet
import { useConnection, abbreviateAddress } from "@evefrontier/dapp-kit";
import { useCurrentAccount } from "@mysten/dapp-kit-react";
function ConnectButton() {
const { handleConnect, handleDisconnect, isConnected, walletAddress, hasEveVault } = useConnection();
const account = useCurrentAccount();
if (!isConnected) {
return (
<div>
<button onClick={handleConnect}>Connect EVE Vault</button>
{!hasEveVault && (
<p style={{ color: "orange" }}>
Please install <a href="https://github.com/evefrontier/evevault/releases/latest/download/eve-vault-chrome.zip">EVE Vault extension</a>
</p>
)}
</div>
);
}
return (
<div>
<span>Connected: {abbreviateAddress(account?.address ?? "")}</span>
<button onClick={handleDisconnect}>Disconnect</button>
</div>
);
}
Meaning of hasEveVault
When hasEveVault is true, it means the EVE Vault extension is installed and discovered in the wallet list. This lets you provide download link guidance to users who haven’t installed it.
The most easily overlooked issue in the connection flow isn’t “can the button light up,” but whether the page switches to the correct state immediately after connecting:
- Is the current address refreshed?
- Are needed object queries refetched?
- Do buttons that depend on wallet capabilities switch display?
5. Send Transaction (Regular Signing)
import { useDAppKit } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
import { useConnection } from "@evefrontier/dapp-kit";
function SendTxButton() {
const { signAndExecuteTransaction } = useDAppKit();
const { isConnected } = useConnection();
const handleSend = async () => {
const tx = new Transaction();
// Call Builder contract
tx.moveCall({
target: `${PACKAGE_ID}::tribe_permit::issue_jump_permit`,
arguments: [
tx.object(EXTENSION_CONFIG_ID),
tx.object(SOURCE_GATE_ID),
tx.object(DEST_GATE_ID),
tx.object(CHARACTER_ID),
tx.object("0x6"), // Sui Clock (fixed object ID)
],
});
try {
const result = await signAndExecuteTransaction({ transaction: tx });
console.log("Transaction successful, Digest:", result.digest);
} catch (err: any) {
// EVE Vault approval popup closed by user
if (err.message?.includes("User rejected")) {
alert("Transaction cancelled by user");
}
}
};
return <button onClick={handleSend} disabled={!isConnected}>Issue Permit</button>;
}
The key to regular signing flow isn’t that the code can call, but that users can understand what they’re signing.
So before transaction buttons, it’s best to explain as clearly as possible:
- Target object
- Key costs
- Expected results
Rather than leaving everything to the wallet approval page.
6. Sponsored Transaction (Sponsored TX)—Most Important Feature
EVE Vault is the only Sui wallet that implements sign_sponsored_transaction. This means Builder’s server can pay Gas for players, so players don’t need to hold SUI to use the dApp.
import { useSponsoredTransaction, WalletSponsoredTransactionNotSupportedError } from "@evefrontier/dapp-kit";
import { Transaction } from "@mysten/sui/transactions";
function SponsoredTxButton() {
const { sponsoredSignAndExecute } = useSponsoredTransaction();
const handleSponsoredTx = async () => {
const tx = new Transaction();
tx.moveCall({
target: `${PACKAGE_ID}::my_extension::some_action`,
arguments: [/* ... */],
});
try {
// Player signs, Gas sponsored by Builder server
const result = await sponsoredSignAndExecute({ transaction: tx });
console.log("Sponsored transaction successful!", result.digest);
} catch (err) {
if (err instanceof WalletSponsoredTransactionNotSupportedError) {
// User using non-EVE Vault wallet, fallback to regular transaction
console.warn("Current wallet doesn't support sponsored transactions, please use EVE Vault");
// Can fallback to signAndExecuteTransaction
}
}
};
return <button onClick={handleSponsoredTx}>Gas-free Operation (EVE Vault Sponsored)</button>;
}
Builder Server-side Sponsorship Configuration
Sponsored transactions require Builder to configure a Gas sponsor account on the server side:
// Builder backend (Node.js)
import { SuiClient } from "@mysten/sui/client";
import { Ed25519Keypair } from "@mysten/sui/keypairs/ed25519";
import { Transaction } from "@mysten/sui/transactions";
const client = new SuiClient({ url: process.env.SUI_RPC_URL! });
const sponsorKeypair = Ed25519Keypair.fromSecretKey(SPONSOR_PRIVATE_KEY);
// Receive player's PTB, add Gas and sign back
app.post("/sponsor-tx", async (req, res) => {
const { serializedTx, playerAddress } = req.body;
const tx = Transaction.from(serializedTx);
// Set sender and Gas sponsor
tx.setSender(playerAddress);
tx.setGasOwner(sponsorKeypair.getPublicKey().toSuiAddress());
const { signature: sponsorSignature } = await tx.sign({
signer: sponsorKeypair,
client,
});
res.json({ sponsorSignature, serializedTx: tx.serialize() });
});
The key to Sponsored Tx integration isn’t “saving Gas,” but “frontend-backend coordination.”
It requires at least three layers working correctly together:
- Frontend can identify wallet capabilities
- Backend can correctly supplement Gas and signatures
- Wallet can complete corresponding approval process
If any one of these layers is misaligned, users will see "can connect, but nothing gets sent."
7. Read Game Object (Smart Object)
import { useSmartObject } from "@evefrontier/dapp-kit";
function GateStatus({ gateItemId }: { gateItemId: string }) {
const { assembly, character, loading, error, refetch } = useSmartObject({
itemId: gateItemId,
});
if (loading) return <div>Loading...</div>;
if (error) return <div>Error: {error.message}</div>;
if (!assembly) return <div>Gate not found</div>;
return (
<div>
<h2>{assembly.name}</h2>
<p>Type ID: {assembly.typeId}</p>
<p>Status: {assembly.state}</p>
<p>Owner: {character?.name ?? "Unknown"}</p>
<button onClick={refetch}>Refresh</button>
</div>
);
}
8. zkLogin Epoch Refresh Handling
zkLogin’s temporary keypair is bound to Sui Epoch (approximately 24 hours). When Epoch expires, keys and ZK Proof need to be regenerated:
import { useConnection } from "@evefrontier/dapp-kit";
import { useDAppKit } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
function TransactionButton() {
const { isConnected, walletAddress } = useConnection();
const { signAndExecuteTransaction } = useDAppKit();
const handleTransaction = async () => {
const tx = new Transaction();
// ...build transaction...
try {
await signAndExecuteTransaction({ transaction: tx });
} catch (err) {
const errMsg = err instanceof Error ? err.message : "";
if (errMsg.includes("ZK proof") || errMsg.includes("maxEpoch")) {
// Epoch expired, ZK Proof invalid
// EVE Vault will automatically pop up re-verification guidance
alert("Your login has expired, please refresh login status in EVE Vault");
} else if (errMsg.includes("User rejected")) {
// User cancelled transaction on approval page
console.log("User cancelled operation");
} else {
console.error("Transaction failed:", errMsg);
}
}
};
return <button onClick={handleTransaction} disabled={!isConnected}>Execute Operation</button>;
}
9. Listen for Network Switching
EVE Vault supports users switching between Devnet/Testnet. dApp needs to respond to this change:
import { useCurrentAccount } from "@mysten/dapp-kit-react";
import { useEffect } from "react";
function NetworkAwareComponent() {
const account = useCurrentAccount();
useEffect(() => {
if (!account) return;
// account.chains contains chains current wallet supports
const currentChain = account.chains[0]; // "sui:testnet" or "sui:devnet"
console.log("Current network:", currentChain);
// Switch API endpoints or contract addresses based on network
}, [account]);
// ...
}
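A chain identifier like "sui:testnet" can be parsed into a network name and mapped to an RPC endpoint. A minimal dependency-free sketch; the endpoint map is an assumption, use whatever endpoints your dApp is configured for:

```typescript
// Parse a wallet-standard chain identifier ("sui:testnet") and map it to an
// RPC endpoint. The endpoint URLs here are illustrative defaults.
const RPC_BY_NETWORK: Record<string, string> = {
  testnet: "https://fullnode.testnet.sui.io:443",
  devnet: "https://fullnode.devnet.sui.io:443",
};

function parseChain(chain: string): { network: string; rpcUrl?: string } {
  const network = chain.split(":")[1] ?? "unknown";
  return { network, rpcUrl: RPC_BY_NETWORK[network] };
}

console.log(parseChain("sui:testnet").network); // "testnet"
```

Switching contract addresses (PACKAGE_ID etc.) on network change follows the same pattern: key a lookup table by the parsed network name.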
10. Message Signing (Personal Message)
import { useDAppKit } from "@mysten/dapp-kit-react";
import { toBase64 } from "@mysten/sui/utils";
function SignMessageButton() {
const { signPersonalMessage } = useDAppKit();
const handleSign = async () => {
const message = new TextEncoder().encode("EVE Frontier Builder Auth: " + Date.now());
const { bytes, signature } = await signPersonalMessage({
message,
});
console.log("Message signature:", signature);
// Can send signature to server to verify user identity (link game account to builder system)
};
return <button onClick={handleSign}>Verify Identity with EVE Vault</button>;
}
11. Complete Example: Gate Extension dApp
Here’s a minimal complete example integrating all features:
// src/App.tsx
import { useConnection, useSmartObject, abbreviateAddress } from "@evefrontier/dapp-kit";
import { useDAppKit } from "@mysten/dapp-kit-react";
import { useSponsoredTransaction } from "@evefrontier/dapp-kit";
import { Transaction } from "@mysten/sui/transactions";
const GATE_ITEM_ID = import.meta.env.VITE_GATE_ITEM_ID;
const PACKAGE_ID = import.meta.env.VITE_BUILDER_PACKAGE_ID;
const EXTENSION_CONFIG_ID = import.meta.env.VITE_EXTENSION_CONFIG_ID;
export function App() {
const { handleConnect, handleDisconnect, isConnected, hasEveVault } = useConnection();
const { assembly, loading } = useSmartObject({ itemId: GATE_ITEM_ID });
const { signAndExecuteTransaction } = useDAppKit();
const { sponsoredSignAndExecute } = useSponsoredTransaction();
const requestJumpPermit = async () => {
const tx = new Transaction();
tx.moveCall({
target: `${PACKAGE_ID}::tribe_permit::issue_jump_permit`,
arguments: [tx.object(EXTENSION_CONFIG_ID), /* ... */],
});
await signAndExecuteTransaction({ transaction: tx });
};
const requestFreeJump = async () => {
// Sponsored transaction version (Builder pays Gas)
const tx = new Transaction();
tx.moveCall({ /* same as above */ });
await sponsoredSignAndExecute({ transaction: tx });
};
return (
<div>
{/* Top bar */}
<header>
<h1>Star Gate Manager</h1>
<button onClick={isConnected ? handleDisconnect : handleConnect}>
{isConnected ? "Disconnect Wallet" : "Connect EVE Vault"}
</button>
</header>
{/* Gate status card */}
{!loading && assembly && (
<div>
<h2>{assembly.name}</h2>
<p>Current status: {assembly.state}</p>
</div>
)}
{/* Action buttons */}
{isConnected && (
<div>
<button onClick={requestJumpPermit}>Request Permit (Pay Gas)</button>
<button onClick={requestFreeJump}>Free Request (Sponsored Transaction)</button>
</div>
)}
{/* EVE Vault not installed prompt */}
{!hasEveVault && (
<div style={{ background: "#fff3cd", padding: 12, borderRadius: 8 }}>
⚠️ Please install{" "}
<a href="https://github.com/evefrontier/evevault/releases/latest/download/eve-vault-chrome.zip">
EVE Vault extension
</a>{" "}
to connect your EVE Frontier account
</div>
)}
</div>
);
}
12. Common Integration Issues
| Issue | Cause | Solution |
|---|---|---|
| WalletSponsoredTransactionNotSupportedError | User is on a non-EVE Vault wallet | Catch the error and fall back to a regular transaction |
| Approval popup doesn't appear | Chrome blocked the popup | Tell the user to check the blocked-popup notification in the top-right corner |
| maxEpoch exceeded | ZK Proof expired | Prompt the user to refresh in the EVE Vault popup |
| hasEveVault = false | Extension not installed or activated | Show the download link and installation guide |
| Network mismatch | dApp expects testnet, wallet is on devnet | Listen to account.chains and prompt the user to switch networks |
Chapter Summary
| Feature | API |
|---|---|
| Detect wallet installation | useConnection().hasEveVault |
| Connect/Disconnect | handleConnect / handleDisconnect |
| Regular transactions | useDAppKit().signAndExecuteTransaction |
| Sponsored transactions | useSponsoredTransaction().sponsoredSignAndExecute |
| Message signing | useDAppKit().signPersonalMessage |
| Read game objects | useSmartObject({ itemId }) |
| Listen for network switching | useCurrentAccount().chains |
Further Reading
- EVE Vault GitHub
- Sui zkLogin Official Documentation
- Enoki Documentation
- Sui Wallet Standard
- @evefrontier/dapp-kit SDK Documentation
You have now mastered the complete knowledge system of the EVE Frontier Builder course: from Move 2024 basics to a deep analysis of the World contracts, from Builder Scaffold engineering practice to EVE Vault wallet integration. It's time to leave your mark among the stars.
Example 4: Quest Unlock System (On-Chain Quests + Conditional Gate)
Goal: Build an on-chain quest system: players complete specified quests, on-chain records completion status; gate extension reads quest status, only allows players who completed quests to jump. Also provides quest publishing and verification dApp.
Status: Mapped to a local code directory. The main content focuses on decoupling quest state from the conditional gate, making this a good entry point for permission-based gameplay.
Code Directory
Minimal Call Chain
Register quest -> Player completes quest -> On-chain records status -> Gate reads quest status -> Allow or deny
Requirements Analysis
Scenario: You operate a gate leading to a high-value mining area. Players must first complete a series of “membership tests” to enter:
- 📋 Quest 1: Donate 100 units of ore to your storage box (verifiable on-chain)
- 🔑 Quest 2: Obtain on-chain certification issued by alliance Leader
- 🚪 Complete all quests → Can pass through the gate to enter mining area
Design Features:
- Quest status is entirely on-chain, cannot be forged
- Quest system and gate system are decoupled, easy to upgrade independently
- dApp provides quest progress tracking and one-click jump application
Part 1: Quest System Contract
quest_registry.move
module quest_system::registry;
use sui::object::{Self, UID, ID};
use sui::table::{Self, Table};
use sui::event;
use sui::tx_context::TxContext;
use sui::transfer;
/// Quest types (using u8 enum)
const QUEST_DONATE_ORE: u8 = 0;
const QUEST_LEADER_CERT: u8 = 1;
/// Quest completion status (bit flags)
/// bit 0: QUEST_DONATE_ORE completed
/// bit 1: QUEST_LEADER_CERT completed
const QUEST_ALL_COMPLETE: u64 = 0b11;
/// Quest Registry (shared object)
public struct QuestRegistry has key {
id: UID,
gate_id: ID, // Which gate this corresponds to
completions: Table<address, u64>, // address → completion bit flags
}
/// Quest admin credential
public struct QuestAdminCap has key, store {
id: UID,
registry_id: ID,
}
/// Events
public struct QuestCompleted has copy, drop {
registry_id: ID,
player: address,
quest_type: u8,
all_done: bool,
}
/// Deploy: Create quest registry
public fun create_registry(
gate_id: ID,
ctx: &mut TxContext,
) {
let registry = QuestRegistry {
id: object::new(ctx),
gate_id,
completions: table::new(ctx),
};
let admin_cap = QuestAdminCap {
id: object::new(ctx),
registry_id: object::id(&registry),
};
transfer::share_object(registry);
transfer::transfer(admin_cap, ctx.sender());
}
/// Admin marks quest complete (called by alliance Leader or management script)
public fun mark_quest_complete(
registry: &mut QuestRegistry,
cap: &QuestAdminCap,
player: address,
quest_type: u8,
ctx: &TxContext,
) {
assert!(cap.registry_id == object::id(registry), ECapMismatch);
// Initialize player entry
if (!table::contains(&registry.completions, player)) {
table::add(&mut registry.completions, player, 0u64);
};
let flags = table::borrow_mut(&mut registry.completions, player);
*flags = *flags | (1u64 << quest_type);
let all_done = *flags == QUEST_ALL_COMPLETE;
event::emit(QuestCompleted {
registry_id: object::id(registry),
player,
quest_type,
all_done,
});
}
/// Query if player completed all quests
public fun is_all_complete(registry: &QuestRegistry, player: address): bool {
if (!table::contains(&registry.completions, player)) {
return false
};
*table::borrow(&registry.completions, player) == QUEST_ALL_COMPLETE
}
/// Query which quests player completed
public fun get_completion_flags(registry: &QuestRegistry, player: address): u64 {
if (!table::contains(&registry.completions, player)) {
return 0
};
*table::borrow(&registry.completions, player)
}
const ECapMismatch: u64 = 0;
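The same bit-flag scheme is easy to mirror off-chain when a script or dApp needs to decode a player's progress. A dependency-free sketch:

```typescript
// Off-chain mirror of the registry's bit flags:
// bit 0 = QUEST_DONATE_ORE, bit 1 = QUEST_LEADER_CERT.
const QUEST_DONATE_ORE = 0;
const QUEST_LEADER_CERT = 1;
const QUEST_ALL_COMPLETE = 0b11;

// Same operation the contract performs when marking a quest complete
function markComplete(flags: number, questType: number): number {
  return flags | (1 << questType);
}

function isAllComplete(flags: number): boolean {
  return flags === QUEST_ALL_COMPLETE;
}

let flags = 0;
flags = markComplete(flags, QUEST_DONATE_ORE);
console.log(isAllComplete(flags)); // false — only quest 0 done
flags = markComplete(flags, QUEST_LEADER_CERT);
console.log(isAllComplete(flags)); // true
```

The dApp later in this example decodes exactly these bits to render the quest checklist.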
quest_gate.move (Gate Extension)
module quest_system::quest_gate;
use quest_system::registry::{Self, QuestRegistry};
use world::gate::{Self, Gate};
use world::character::Character;
use sui::clock::Clock;
use sui::tx_context::TxContext;
/// Gate extension Witness
public struct QuestGateAuth has drop {}
/// Request jump permit after completing quests
public fun quest_jump(
source_gate: &Gate,
dest_gate: &Gate,
character: &Character,
quest_registry: &QuestRegistry,
clock: &Clock,
ctx: &mut TxContext,
) {
// Verify caller completed all quests
assert!(
registry::is_all_complete(quest_registry, ctx.sender()),
EQuestsNotComplete,
);
// Issue jump permit (valid for 30 minutes)
let expires_at = clock.timestamp_ms() + 30 * 60 * 1000;
gate::issue_jump_permit(
source_gate,
dest_gate,
character,
QuestGateAuth {},
expires_at,
ctx,
);
}
const EQuestsNotComplete: u64 = 0;
Part 2: Quest Verification Logic (Quest 1: Donate Ore)
Quest 1 (donate ore) requires off-chain monitoring of SSU storage events, then admin manually (or script automatically) marks completion.
// scripts/auto-quest-monitor.ts
import { SuiClient } from "@mysten/sui/client"
import { Transaction } from "@mysten/sui/transactions"
import { Ed25519Keypair } from "@mysten/sui/keypairs/ed25519"
const QUEST_PACKAGE = "0x_QUEST_PACKAGE_"
const REGISTRY_ID = "0x_REGISTRY_ID_"
const QUEST_ADMIN_CAP_ID = "0x_QUEST_ADMIN_CAP_"
const STORAGE_UNIT_ID = "0x_SSU_ID_"
const DONATE_ORE_TYPE_ID = 12345 // Ore item type ID
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" })
const adminKeypair = Ed25519Keypair.fromSecretKey(/* ... */)
// Monitor SSU donation events
async function monitorDonations() {
await client.subscribeEvent({
filter: {
MoveEventType: `${"0x_WORLD_PACKAGE_"}::storage_unit::ItemDeposited`,
},
onMessage: async (event) => {
const { depositor, storage_unit_id, item_type_id } = event.parsedJson as any
// Check if it's our SSU and specified item
if (
storage_unit_id === STORAGE_UNIT_ID &&
Number(item_type_id) === DONATE_ORE_TYPE_ID
) {
console.log(`Player ${depositor} donated ore, marking quest complete...`)
await markQuestComplete(depositor, 0) // quest_type = 0 (QUEST_DONATE_ORE)
}
},
})
}
async function markQuestComplete(player: string, questType: number) {
const tx = new Transaction()
tx.moveCall({
target: `${QUEST_PACKAGE}::registry::mark_quest_complete`,
arguments: [
tx.object(REGISTRY_ID),
tx.object(QUEST_ADMIN_CAP_ID),
tx.pure.address(player),
tx.pure.u8(questType),
],
})
const result = await client.signAndExecuteTransaction({
signer: adminKeypair,
transaction: tx,
})
console.log(`Quest marked successfully: ${result.digest}`)
}
monitorDonations()
Part 3: Quest Tracker dApp
// src/QuestTrackerApp.tsx
import { useState, useEffect } from 'react'
import { useConnection, getObjectWithJson } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
import { SuiClient } from '@mysten/sui/client'
const QUEST_PACKAGE = "0x_QUEST_PACKAGE_"
const REGISTRY_ID = "0x_REGISTRY_ID_"
const SOURCE_GATE_ID = "0x..."
const DEST_GATE_ID = "0x..."
const CHARACTER_ID = "0x..."
const QUEST_NAMES = [
{ id: 0, name: 'Donate Ore', description: 'Deposit 100 units of ore into alliance storage' },
{ id: 1, name: 'Get Certified', description: 'Contact alliance Leader to issue on-chain certification' },
]
export function QuestTrackerApp() {
const { isConnected, handleConnect, currentAddress } = useConnection()
const dAppKit = useDAppKit()
const [flags, setFlags] = useState<number>(0)
const [isJumping, setIsJumping] = useState(false)
const [status, setStatus] = useState('')
const allComplete = flags === 0b11
// Load quest completion status
useEffect(() => {
if (!currentAddress) return
const loadFlags = async () => {
// Read player entry in table via GraphQL
const client = new SuiClient({ url: 'https://fullnode.testnet.sui.io:443' })
const obj = await client.getDynamicFieldObject({
parentId: REGISTRY_ID,
name: {
type: 'address',
value: currentAddress,
},
})
if (obj.data?.content?.dataType === 'moveObject') {
setFlags(Number((obj.data.content.fields as any).value))
} else {
setFlags(0) // Player has no record yet
}
}
loadFlags()
}, [currentAddress])
const handleJump = async () => {
if (!allComplete) {
setStatus('❌ Please complete all quests first')
return
}
setIsJumping(true)
setStatus('⏳ Requesting jump permit...')
try {
const tx = new Transaction()
tx.moveCall({
target: `${QUEST_PACKAGE}::quest_gate::quest_jump`,
arguments: [
tx.object(SOURCE_GATE_ID),
tx.object(DEST_GATE_ID),
tx.object(CHARACTER_ID),
tx.object(REGISTRY_ID),
tx.object('0x6'), // Clock
],
})
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus('🚀 Jump permit obtained, enjoy the mining area!')
} catch (e: any) {
setStatus(`❌ ${e.message}`)
} finally {
setIsJumping(false)
}
}
return (
<div className="quest-tracker">
<h1>🌟 Alliance Membership Test</h1>
{!isConnected ? (
<button onClick={handleConnect}>Connect Wallet</button>
) : (
<>
<div className="quest-list">
{QUEST_NAMES.map(quest => {
const done = (flags & (1 << quest.id)) !== 0
return (
<div key={quest.id} className={`quest-item ${done ? 'done' : 'pending'}`}>
<span className="quest-icon">{done ? '✅' : '⬜'}</span>
<div>
<strong>{quest.name}</strong>
<p>{quest.description}</p>
</div>
</div>
)
})}
</div>
<div className="progress">
Completion Progress: {QUEST_NAMES.filter(q => (flags & (1 << q.id)) !== 0).length} / {QUEST_NAMES.length}
</div>
<button
className={`jump-btn ${allComplete ? 'active' : 'locked'}`}
onClick={handleJump}
disabled={!allComplete || isJumping}
>
{allComplete
? (isJumping ? '⏳ Requesting...' : '🚀 Enter Mining Area')
: '🔒 Complete all quests to enter'
}
</button>
{status && <p className="status">{status}</p>}
</>
)}
</div>
)
}
🎯 Complete Review
Contract Layer
├── quest_registry.move
│ ├── QuestRegistry (shared object, stores player completion bit flags)
│ ├── QuestAdminCap (admin credential)
│ ├── mark_quest_complete() ← Admin calls
│ └── is_all_complete() ← Gate contract calls
│
└── quest_gate.move
├── QuestGateAuth (gate extension Witness)
└── quest_jump() ← Player calls
├── registry::is_all_complete() → Verify quest completion
└── gate::issue_jump_permit() → Issue permit
Off-Chain Monitoring
└── auto-quest-monitor.ts
├── Subscribe to SSU ItemDeposited events
└── Automatically call mark_quest_complete()
dApp Layer
└── QuestTrackerApp.tsx
├── Display quest progress (decode bit flags)
└── One-click jump permit request
🔧 Extension Exercises
- Quest Expiration: Quests valid for 7 days after completion, expired need re-completion (store timestamp alongside bit flags)
- On-Chain Quest 1 (no off-chain step needed): the player actively calls a `donate_ore()` function that transfers the item directly, and the contract marks the quest complete automatically
- Quest Points: each quest carries a different point weight; the gate unlocks when total points reach a threshold
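For the "Quest Expiration" exercise, one workable shape is to keep a per-quest completion timestamp next to the flags and treat anything older than 7 days as expired. A dependency-free sketch; the record shape is an assumption:

```typescript
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

interface QuestRecord {
  flags: number;                         // completion bit flags, as on-chain
  completedAtMs: Record<number, number>; // questType -> completion timestamp
}

function isQuestValid(rec: QuestRecord, questType: number, nowMs: number): boolean {
  if ((rec.flags & (1 << questType)) === 0) return false; // never completed
  return nowMs - (rec.completedAtMs[questType] ?? 0) <= SEVEN_DAYS_MS;
}

const rec: QuestRecord = { flags: 0b01, completedAtMs: { 0: 0 } };
console.log(isQuestValid(rec, 0, SEVEN_DAYS_MS));     // true — exactly 7 days
console.log(isQuestValid(rec, 0, SEVEN_DAYS_MS + 1)); // false — expired
console.log(isQuestValid(rec, 1, 0));                 // false — never completed
```

On-chain, the same idea would store a `Table<address, vector<u64>>` of timestamps and have the gate compare against `clock.timestamp_ms()`.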
📚 Related Documentation
Practical Case 11: Item Rental System (Rent Instead of Sell)
Objective: Build an on-chain item rental marketplace—item owners rent out instead of selling equipment, renters have usage rights during the validity period, and items are automatically returned after expiration (or can be redeemed).
Status: Teaching example. The main text explains the core business flow; the complete directory is based on the local `book/src/code/example-11/`.
Corresponding Code Directory
Minimal Call Chain
Create listing -> User rents -> Contract mints RentalPass -> Expiration or early return -> Fund settlement
Test Loop
- Listing creation: confirm `is_available == true` and that the listing can be queried correctly by the frontend
- Successful rental: confirm the renter receives a `RentalPass` and the owner receives 70% of the rent
- Early return: confirm the refund is calculated from the remaining days and the remaining deposit flows to the owner
- Expiration reclaim: confirm reclaim fails before expiration and succeeds after expiration
Requirements Analysis
Scenario: High-end ship modules are expensive, most players can’t afford them, but can rent them:
- Owner locks module into rental contract, sets daily rent and maximum rental period
- Renter pays rent and receives a temporary usage-rights credential NFT (`RentalPass`)
- The credential carries an expiration timestamp; the contract verifies validity at use time
- After expiration, owner can reclaim the module (or renew)
- If renter returns early, refund remaining days’ rent
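Before diving into the contract, it helps to pin down the pricing arithmetic. Rent is denominated in MIST (1 SUI = 1,000,000,000 MIST), so quote math should use BigInt to avoid float precision loss. A dependency-free sketch:

```typescript
// Quote the total rent for a lease in MIST, using BigInt throughout.
const MIST_PER_SUI = 1_000_000_000n;

function totalRentMist(dailyRateMist: bigint, days: bigint): bigint {
  return dailyRateMist * days;
}

// Format a MIST amount as a decimal SUI string for display.
function mistToSui(mist: bigint): string {
  return `${mist / MIST_PER_SUI}.${(mist % MIST_PER_SUI).toString().padStart(9, "0")}`;
}

const dailyRate = 2n * MIST_PER_SUI; // 2 SUI/day
const total = totalRentMist(dailyRate, 7n);
console.log(total);            // 14000000000n
console.log(mistToSui(total)); // "14.000000000"
```

The dApp in Part Two divides by 1e9 for display, which is fine for small amounts; for on-chain arguments, keep the value as an integer in MIST.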
Part One: Rental Contract
module rental::equipment_rental;
use sui::object::{Self, UID, ID};
use sui::table::{Self, Table};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::transfer;
use sui::event;
use std::string::String;
// ── Constants ──────────────────────────────────────────────────
const DAY_MS: u64 = 86_400_000;
// ── Data Structures ───────────────────────────────────────────────
/// Rental listing (locks item)
public struct RentalListing has key {
id: UID,
item_id: ID, // Rented item object ID
item_name: String,
owner: address,
daily_rate_sui: u64, // Daily rent (MIST)
max_days: u64, // Maximum rental period
deposited_balance: Balance<SUI>, // Owner's pre-deposited security deposit (optional)
is_available: bool,
current_renter: option::Option<address>,
lease_expires_ms: u64,
}
/// Rental pass NFT (held by renter)
public struct RentalPass has key, store {
id: UID,
listing_id: ID,
item_name: String,
renter: address,
expires_ms: u64,
prepaid_days: u64,
refundable_balance: Balance<SUI>, // Refundable balance (for early return)
}
// ── Events ──────────────────────────────────────────────────
public struct ItemRented has copy, drop {
listing_id: ID,
renter: address,
days: u64,
total_paid: u64,
expires_ms: u64,
}
public struct ItemReturned has copy, drop {
listing_id: ID,
renter: address,
early: bool,
refund_amount: u64,
}
// ── Owner Operations ────────────────────────────────────────────
/// Create rental listing
public fun create_listing(
item_name: vector<u8>,
tracked_item_id: ID, // Item's Object ID (contract tracks, actual item in SSU)
daily_rate_sui: u64,
max_days: u64,
ctx: &mut TxContext,
) {
let listing = RentalListing {
id: object::new(ctx),
item_id: tracked_item_id,
item_name: std::string::utf8(item_name),
owner: ctx.sender(),
daily_rate_sui,
max_days,
deposited_balance: balance::zero(),
is_available: true,
current_renter: option::none(),
lease_expires_ms: 0,
};
transfer::share_object(listing);
}
/// Delist (can only withdraw when item is not rented)
public fun delist(
listing: &mut RentalListing,
ctx: &TxContext,
) {
assert!(listing.owner == ctx.sender(), ENotOwner);
assert!(listing.is_available, EItemCurrentlyRented);
listing.is_available = false;
}
// ── Renter Operations ────────────────────────────────────────────
/// Rent item
public fun rent_item(
listing: &mut RentalListing,
days: u64,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(listing.is_available, ENotAvailable);
assert!(days >= 1 && days <= listing.max_days, EInvalidDays);
let total_cost = listing.daily_rate_sui * days;
assert!(coin::value(&payment) >= total_cost, EInsufficientPayment);
let expires_ms = clock.timestamp_ms() + days * DAY_MS;
// Deduct rent
let mut rent_payment = payment.split(total_cost, ctx);
// Send 70% to owner, remaining 30% locked in RentalPass as deposit (refunded on early return)
let owner_share = rent_payment.split(total_cost * 70 / 100, ctx);
transfer::public_transfer(owner_share, listing.owner);
// Update listing state
listing.is_available = false;
listing.current_renter = option::some(ctx.sender());
listing.lease_expires_ms = expires_ms;
// Issue RentalPass NFT
let pass = RentalPass {
id: object::new(ctx),
listing_id: object::id(listing),
item_name: listing.item_name,
renter: ctx.sender(),
expires_ms,
prepaid_days: days,
refundable_balance: coin::into_balance(rent_payment), // Remaining 30%
};
// Return change
if (coin::value(&payment) > 0) {
transfer::public_transfer(payment, ctx.sender());
} else { coin::destroy_zero(payment); }
transfer::public_transfer(pass, ctx.sender());
event::emit(ItemRented {
listing_id: object::id(listing),
renter: ctx.sender(),
days,
total_paid: total_cost,
expires_ms,
});
}
/// Verify rental validity when using item
public fun verify_rental(
pass: &RentalPass,
listing_id: ID,
clock: &Clock,
): bool {
pass.listing_id == listing_id
&& clock.timestamp_ms() <= pass.expires_ms
}
/// Early return (refund deposit)
public fun return_early(
listing: &mut RentalListing,
mut pass: RentalPass,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(pass.listing_id == object::id(listing), EWrongListing);
assert!(pass.renter == ctx.sender(), ENotRenter);
assert!(clock.timestamp_ms() < pass.expires_ms, EAlreadyExpired);
// Calculate refund for remaining days
let remaining_ms = pass.expires_ms - clock.timestamp_ms();
let remaining_days = remaining_ms / DAY_MS;
let refund = if (remaining_days > 0) {
balance::value(&pass.refundable_balance) * remaining_days / pass.prepaid_days
} else { 0 };
// Refund
if (refund > 0) {
let refund_coin = coin::take(&mut pass.refundable_balance, refund, ctx);
transfer::public_transfer(refund_coin, ctx.sender());
};
// Send any remaining deposit to the owner
let remaining_bal = balance::withdraw_all(&mut pass.refundable_balance);
if (balance::value(&remaining_bal) > 0) {
transfer::public_transfer(coin::from_balance(remaining_bal, ctx), listing.owner);
} else { balance::destroy_zero(remaining_bal); }
// Return listing availability
listing.is_available = true;
listing.current_renter = option::none();
let RentalPass { id, refundable_balance, .. } = pass;
balance::destroy_zero(refundable_balance);
id.delete();
event::emit(ItemReturned {
listing_id: object::id(listing),
renter: ctx.sender(),
early: true,
refund_amount: refund,
});
}
/// After rental expires, owner reclaims control
public fun reclaim_after_expiry(
listing: &mut RentalListing,
clock: &Clock,
ctx: &TxContext,
) {
assert!(listing.owner == ctx.sender(), ENotOwner);
assert!(!listing.is_available, EAlreadyAvailable);
assert!(clock.timestamp_ms() > listing.lease_expires_ms, ELeaseNotExpired);
listing.is_available = true;
listing.current_renter = option::none();
}
// ── Error Codes ────────────────────────────────────────────────
const ENotOwner: u64 = 0;
const EItemCurrentlyRented: u64 = 1;
const ENotAvailable: u64 = 2;
const EInvalidDays: u64 = 3;
const EInsufficientPayment: u64 = 4;
const EWrongListing: u64 = 5;
const ENotRenter: u64 = 6;
const EAlreadyExpired: u64 = 7;
const EAlreadyAvailable: u64 = 8;
const ELeaseNotExpired: u64 = 9;
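The settlement rules in the contract above (70% to the owner up front, 30% held as a refundable deposit, early-return refund proportional to full remaining days) can be mirrored in plain TypeScript for frontend display or tests. A sketch using the same integer division as Move:

```typescript
// Mirror of the contract's settlement math, using floor division like Move.
const DAY_MS = 86_400_000;

function settle(totalCost: number, prepaidDays: number, remainingMs: number) {
  const ownerShare = Math.floor((totalCost * 70) / 100); // paid at rent time
  const deposit = totalCost - ownerShare;                // locked in the RentalPass
  const remainingDays = Math.floor(remainingMs / DAY_MS); // only FULL days refund
  const refund =
    remainingDays > 0 ? Math.floor((deposit * remainingDays) / prepaidDays) : 0;
  return { ownerShare, deposit, refund, toOwnerOnReturn: deposit - refund };
}

// Rent 10 days at 1_000 MIST/day, return with 4 full days left:
console.log(settle(10_000, 10, 4 * DAY_MS));
// → { ownerShare: 7000, deposit: 3000, refund: 1200, toOwnerOnReturn: 1800 }
```

Note the rounding choice: partial days never refund, matching `remaining_ms / DAY_MS` integer division in `return_early`.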
Part Two: Rental Market dApp
// src/RentalMarket.tsx
import { useState } from 'react'
import { useCurrentClient } from '@mysten/dapp-kit-react'
import { useQuery } from '@tanstack/react-query'
import { Transaction } from '@mysten/sui/transactions'
import { useDAppKit } from '@mysten/dapp-kit-react'
const RENTAL_PKG = "0x_RENTAL_PACKAGE_"
interface Listing {
id: string
item_name: string
owner: string
daily_rate_sui: string
max_days: string
is_available: boolean
lease_expires_ms: string
}
function DaysLeftBadge({ expireMs }: { expireMs: number }) {
const remaining = Math.max(0, expireMs - Date.now())
const days = Math.ceil(remaining / 86400000)
if (days === 0) return <span className="badge badge--expired">Expired</span>
return <span className="badge badge--active">{days} days remaining</span>
}
export function RentalMarket() {
const client = useCurrentClient()
const dAppKit = useDAppKit()
const [rentDays, setRentDays] = useState(1)
const [status, setStatus] = useState('')
const { data: listings } = useQuery({
queryKey: ['rental-listings'],
queryFn: async () => {
// Teaching example: directly read current listing objects.
// Real projects should maintain "rentable listing" view through indexer, rather than reverse listing from rental events.
const objects = await client.getOwnedObjects({
owner: '0x_RENTAL_REGISTRY_OWNER_',
filter: { StructType: `${RENTAL_PKG}::equipment_rental::RentalListing` },
options: { showContent: true },
})
return objects.data.map(obj => (obj.data?.content as any)?.fields).filter(Boolean) as Listing[]
},
})
const handleRent = async (listingId: string, dailyRate: number) => {
const tx = new Transaction()
const totalCost = BigInt(dailyRate) * BigInt(rentDays) // BigInt math avoids float precision loss on large MIST values
const [payment] = tx.splitCoins(tx.gas, [tx.pure.u64(totalCost)])
tx.moveCall({
target: `${RENTAL_PKG}::equipment_rental::rent_item`,
arguments: [
tx.object(listingId),
tx.pure.u64(rentDays),
payment,
tx.object('0x6'),
],
})
try {
setStatus('Submitting rental transaction...')
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus('Rental successful! RentalPass sent to your wallet')
} catch (e: any) {
setStatus(`${e.message}`)
}
}
return (
<div className="rental-market">
<h1>Equipment Rental Market</h1>
<p className="subtitle">Rent instead of buy, flexibly use high-end equipment</p>
<div className="rent-days-selector">
<label>Rental Period:</label>
{[1, 3, 7, 14, 30].map(d => (
<button
key={d}
className={rentDays === d ? 'selected' : ''}
onClick={() => setRentDays(d)}
>
{d} days
</button>
))}
</div>
<div className="listings-grid">
{listings?.map(listing => (
<div key={listing.id} className="listing-card">
<h3>{listing.item_name}</h3>
<div className="listing-meta">
<span>{Number(listing.daily_rate_sui) / 1e9} SUI/day</span>
<span>Max {listing.max_days} days</span>
</div>
<div className="listing-cost">
Rent {rentDays} days total: <strong>{Number(listing.daily_rate_sui) * rentDays / 1e9} SUI</strong>
</div>
{listing.is_available ? (
<button
className="rent-btn"
onClick={() => handleRent(listing.id, Number(listing.daily_rate_sui))}
>
Rent Now
</button>
) : (
<DaysLeftBadge expireMs={Number(listing.lease_expires_ms)} />
)}
</div>
))}
</div>
{status && <p className="status">{status}</p>}
</div>
)
}
Key Design Highlights
| Mechanism | Implementation |
|---|---|
| Time control | RentalPass.expires_ms + clock.timestamp_ms() real-time verification |
| Deposit management | 30% rent locked in RentalPass.refundable_balance |
| Early return | Refund based on remaining days proportion, rest goes to owner |
| Expiration reclaim | reclaim_after_expiry() called by owner after expiration |
| Double rental prevention | is_available flag ensures only one renter at a time |
Related Documentation
Chapter 11: Deep Dive into Ownership Model
Objective: Deeply understand EVE Frontier’s capability object system, master the complete lifecycle of OwnerCap, and learn to design secure delegation authorization and ownership transfer schemes.
Status: Advanced design chapter. Text focuses on OwnerCap, delegation, and ownership lifecycle.
11.1 Why Have a Dedicated Ownership Model?
When many newcomers first design a permission system, the intuition is:
- Record an owner address
- Check if the caller is this address for every operation
This approach is convenient in the short term, but once entering EVE Frontier’s world of “facilities that can be operated, transferred, delegated, and composed,” problems quickly emerge:
- Not delegable: it's hard to safely hand over partial power to others temporarily
- Not composable: permission rules scatter across functions, and the system grows increasingly chaotic
- No fine-grained control: it's hard to express "can operate this turret, but not that gate"
- Not naturally transferable: once facilities, characters, and operating rights migrate, hardcoded addresses become fragile
EVE Frontier uses Sui’s native Capability object system. Its core idea isn’t “who are you,” but:
What permission object are you holding.
This transforms ownership from “account attribute” to “composable, transferable, verifiable on-chain entity.”
11.2 Permission Hierarchy Structure
GovernorCap (deployer holds — highest permission)
│
└── AdminACL (shared object — authorized server address list)
│
└── OwnerCap<T> (player holds — operation rights for specific objects)
GovernorCap: Game Operation Layer
GovernorCap is created during contract deployment, held by CCP Games (game operators). It can:
- Add/remove server authorization addresses to
AdminACL - Execute global configuration changes
As a Builder, you don’t need to worry about GovernorCap.
AdminACL: Server Authorization Layer
AdminACL is a shared object containing a list of authorized game server addresses.
Certain operations (like proximity proof, jump verification) require game server as sponsor to sign transactions:
// Verify if caller is authorized sponsor
public fun verify_sponsor(admin_acl: &AdminACL, ctx: &TxContext) {
assert!(
admin_acl.sponsors.contains(ctx.sponsor().unwrap()),
EUnauthorizedSponsor
);
}
This means: certain sensitive operations cannot be completed by players alone, must go through game server verification.
OwnerCap: Player Operation Layer
public struct OwnerCap<phantom T> has key {
id: UID,
authorized_object_id: ID, // Only valid for this specific object
}
phantom T makes OwnerCap<Gate> and OwnerCap<StorageUnit> completely different types that cannot be mixed—this is type system level security guarantee.
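The same idea can be approximated in TypeScript with branded types, to show what the phantom parameter buys: caps for different facility kinds simply don't typecheck against each other. A sketch with illustrative names, not the SDK's:

```typescript
// Branded-type approximation of OwnerCap<phantom T>: the `kind` brand makes
// OwnerCap<"Gate"> and OwnerCap<"StorageUnit"> incompatible at compile time.
type OwnerCap<T extends string> = { readonly id: string; readonly kind: T };

function makeCap<T extends string>(id: string, kind: T): OwnerCap<T> {
  return { id, kind };
}

// Only accepts a Gate cap — the type system rejects everything else.
function operateGate(cap: OwnerCap<"Gate">): string {
  return `operating gate with cap ${cap.id}`;
}

const gateCap = makeCap("0xb2", "Gate");
const ssuCap = makeCap("0xc3", "StorageUnit");

console.log(operateGate(gateCap)); // "operating gate with cap 0xb2"
// operateGate(ssuCap); // compile error: '"StorageUnit"' is not assignable to '"Gate"'
void ssuCap;
```

In Move the guarantee is stronger: the phantom type is checked by the bytecode verifier, not just at compile time.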
Why Separate These Three Permission Layers?
You can think of them as three completely different responsibilities:
- GovernorCap answers "what are the world-level rules, and who governs globally"
- AdminACL answers "which servers or backend processes are trusted"
- OwnerCap answers "which specific business entity can operate which facility"
The biggest advantage of separating them: the system never conflates "global governance rights" with "single-facility operation rights."
Otherwise you easily end up with a bad structure where one address is:
- the server authorizer,
- the administrator of every facility,
- and the executor of ad-hoc business operations.
If that single address is ever compromised, the entire system's permission boundaries collapse.
11.3 Character as Keychain
All of a player's OwnerCaps are stored in the Character object, not sent directly to the wallet address.
Player wallet address
└── Character (shared object, mapped to wallet address)
├── OwnerCap<NetworkNode> → Network node 0x...a1
├── OwnerCap<Gate> → Gate 0x...b2
├── OwnerCap<StorageUnit> → Storage box 0x...c3
└── OwnerCap<Gate> → Gate 0x...d4 (second gate)
Why this design?
- All asset ownership is concentrated in the Character, so transferring the Character transfers all assets
- Even if a player changes wallet address, the Character remains and assets aren't lost
- It works with alliance mechanisms to enable collective ownership management
One thing to note here:
Character isn’t just a simple wallet mapping layer, but a true permission container.
It organizes “people, characters, facilities, permissions” together across these dimensions:
- Wallet is signing entry
- Character is business entity
- OwnerCap is specific facility permissions
- Facility objects are controlled assets
The benefit of this is when you later do:
- Account migration
- Multi-sig control
- Alliance trusteeship
- Character transfer
You don’t need to rewrite an entire permission system, but make changes around the Character layer.
11.4 Complete Borrow-Use-Return Pattern
Executing any operation requiring OwnerCap must follow the “borrow → use → return” three-step atomic transaction:
// Character module provided interface
public fun borrow_owner_cap<T: key>(
character: &mut Character,
owner_cap_ticket: Receiving<OwnerCap<T>>, // Use Receiving pattern
ctx: &TxContext,
): (OwnerCap<T>, ReturnOwnerCapReceipt) // Return Cap + hot potato receipt
public fun return_owner_cap<T: key>(
character: &Character,
owner_cap: OwnerCap<T>,
receipt: ReturnOwnerCapReceipt, // Must consume receipt
)
ReturnOwnerCapReceipt is a hot potato (no Abilities), ensuring OwnerCap must be returned, cannot be lost outside transaction.
What Does This Pattern Really Prevent?
It’s not simply for “elegant writing,” but prevents several very real risks:
- High-privilege objects intercepted mid-transaction
- Scripts forget to return permissions, leaving dangling state
- Extension logic brings permission objects into wrong paths
- In multi-step operations, permission boundaries become no longer auditable
Forcing borrow -> use -> return into the same transaction is like adding a hard constraint to high-privilege operations:
You can temporarily use it to do things, but cannot take it away.
Why Pair with Hot Potato Receipt?
Because relying on developers to consciously call return isn't enough.
As long as the type system allows you to skip the return step, someone will eventually:
- Forget it in a script
- Delete it during a refactor
- Return early in an error branch
With the receipt, the compiler and type system force you to complete the full cycle.
Complete TypeScript Call Example
import { Transaction } from "@mysten/sui/transactions";
const WORLD_PKG = "0x...";
async function bringGateOnline(
tx: Transaction,
characterId: string,
ownerCapId: string,
gateId: string,
networkNodeId: string,
) {
// ① Borrow OwnerCap
const [ownerCap, receipt] = tx.moveCall({
target: `${WORLD_PKG}::character::borrow_owner_cap`,
typeArguments: [`${WORLD_PKG}::gate::Gate`],
arguments: [
tx.object(characterId),
tx.receivingRef({ objectId: ownerCapId, version: "...", digest: "..." }),
],
});
// ② Use OwnerCap: bring gate online
tx.moveCall({
target: `${WORLD_PKG}::gate::online`,
arguments: [
tx.object(gateId),
tx.object(networkNodeId),
tx.object(ENERGY_CONFIG_ID),
ownerCap,
],
});
// ③ Return OwnerCap (receipt consumed, hot potato makes this step unskippable)
tx.moveCall({
target: `${WORLD_PKG}::character::return_owner_cap`,
arguments: [tx.object(characterId), ownerCap, receipt],
});
}
11.5 Ownership Transfer Scenarios
Scenario 1: Transfer Control of Single Component
If you want to hand over control of one gate to an ally (but keep your Character and other facilities), you can transfer only the corresponding OwnerCap:
// Extract OwnerCap from your Character, send to ally
const tx = new Transaction();
// Extract OwnerCap (note this isn't borrowing, but transferring)
// Specific API subject to world contract, this is just conceptual
tx.moveCall({
target: `${WORLD_PKG}::character::transfer_owner_cap`,
typeArguments: [`${WORLD_PKG}::gate::Gate`],
arguments: [
tx.object(myCharacterId),
tx.object(ownerCapId),
tx.pure.address(allyAddress), // Ally's Character address
],
});
Scenario 2: Transfer Complete Character (All Assets Packaged Transfer)
Transferring the entire Character object gives the receiving wallet address control over all bound assets. This suits alliance-wide asset handovers and account trading scenarios.
Need to distinguish three actions that sound similar but are completely different:
- Transfer a single OwnerCap: only hand over control of one facility
- Transfer the Character: hand over the entire chain of permissions and assets
- Delegate operation: don't transfer ownership, only grant a limited operation capability
If these three aren’t separated, your product design will quickly become messy.
For example, alliance treasury scenario:
- Property rights may belong to alliance entity
- Daily operation rights may belong to on-duty members
- Emergency shutdown rights may belong only to core administrators
This means a single "owner" relationship cannot express all of these arrangements.
Scenario 3: Delegate Operation (Without Transferring Ownership)
By writing extension contracts, you can allow specific addresses to operate your facilities in limited scope without transferring OwnerCap:
// In your extension contract, maintain an operator whitelist
public struct OperatorRegistry has key {
id: UID,
operators: Table<address, bool>,
}
public fun delegated_action(
registry: &OperatorRegistry,
ctx: &TxContext,
) {
// Verify caller is in operator list
assert!(registry.operators.contains(ctx.sender()), ENotOperator);
// ... execute operation
}
Easiest Pitfall in Delegation
Many people's first delegation design treats a whitelist as "weakened ownership." That isn't enough.
A secure delegation design needs to answer at least:
- What actions can delegatee do, what can’t they do?
- Does delegation have time limits?
- Can delegation be revoked?
- Is delegation only valid for one facility?
- Can delegatee re-delegate?
If these boundaries aren't written down clearly, delegation degrades from "flexible authorization" into "invisibly giving rights away."
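The boundary questions above can be captured in an explicit delegation record. A minimal sketch (types and names are hypothetical, not from the world contract):

```typescript
// Hypothetical delegation record answering: which facility, which actions,
// until when, and whether it has been revoked.
interface Delegation {
  facilityId: string;   // only valid for one facility
  actions: Set<string>; // what the delegatee may do
  expiresAtMs: number;  // time limit
  revoked: boolean;     // revocability
}

function canAct(d: Delegation, facilityId: string, action: string, nowMs: number): boolean {
  return !d.revoked
    && d.facilityId === facilityId
    && d.actions.has(action)
    && nowMs < d.expiresAtMs;
}
```

Every boundary question becomes a concrete field, so "can the delegatee do X right now" has an auditable answer instead of an implicit one.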
11.6 OwnerCap Security Boundaries
Each OwnerCap Only Valid for One Object
public fun verify_owner_cap<T: key>(
obj: &T,
owner_cap: &OwnerCap<T>,
) {
// authorized_object_id ensures this OwnerCap can only be used for corresponding object
assert!(
owner_cap.authorized_object_id == object::id(obj),
EOwnerCapMismatch
);
}
This means if you have two gates, you have two OwnerCap<Gate>, they cannot be used interchangeably.
Why is authorized_object_id So Critical?
Because phantom T only ensures that object categories cannot be mixed; it does not prevent mixing different instances of the same category.
For example:
- OwnerCap<Gate> can only be used for a Gate, no problem
- But without authorized_object_id, the OwnerCap for one of your Gates could incorrectly operate another Gate
So the complete security boundary actually has two layers:
- Type boundary: Gate and StorageUnit cannot mix
- Instance boundary: this Gate and that Gate also cannot mix
Losing OwnerCap Means Losing Control
If the Character containing your OwnerCaps is transferred, you lose control of all facilities. Safeguard the private key of the wallet that controls your Character object.
From operational perspective, more accurately, you need to protect not “some button permission,” but the entire business control chain:
- Wallet signing rights
- Character control rights
- OwnerCap collection inside Character
- Critical delegation configurations and multi-sig settings
Once this chain breaks, recovery cost is very high.
11.7 Advanced: Multi-sig & Alliance Co-ownership
Through Sui’s multisig functionality, an alliance can jointly control critical facilities:
# Create 2/3 multi-sig address (requires 2 out of 3 members to agree to operate)
sui keytool multi-sig-address \
--pks <pk1> <pk2> <pk3> \
--weights 1 1 1 \
--threshold 2
Set Character’s control address to multi-sig address, alliance critical assets require multiple signatures to operate.
What’s Multi-sig Suitable For, What’s Not?
Multi-sig is very suitable for:
- Alliance treasury
- Ultra-high value infrastructure
- Critical parameter adjustments
- Upgrades & emergency shutdowns
Multi-sig not necessarily suitable for:
- High-frequency daily operations
- Player interactions requiring second-level response
- Large numbers of small repetitive management actions
So realistic practice usually isn’t “put everything on multi-sig,” but layer it:
- Core control rights on multi-sig
- Daily operational permissions released to execution layer through limited delegation
This is closer to real organizational structure.
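The 2-of-3 setup above reduces to simple weight arithmetic. A sketch of the threshold rule (illustrative; Sui evaluates this natively when verifying a multisig signature):

```typescript
// Sum the weights of the keys that actually signed; the signature is valid
// only if that sum reaches the configured threshold.
function meetsThreshold(weights: number[], signed: boolean[], threshold: number): boolean {
  const total = weights.reduce((acc, w, i) => acc + (signed[i] ? w : 0), 0);
  return total >= threshold;
}
```

With weights [1, 1, 1] and threshold 2, any two signers suffice; unequal weights let you give a core administrator more say than on-duty members.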
Chapter Summary
| Concept | Key Points |
|---|---|
| Permission hierarchy | GovernorCap > AdminACL > OwnerCap |
| Character keychain | All OwnerCap centrally stored, transferring Character = transferring all assets |
| Borrow-Use-Return | Three-step atomic operation, ReturnReceipt (hot potato) ensures must return |
| Type safety | OwnerCap<Gate> ≠ OwnerCap<StorageUnit>, cannot mix |
| Delegate operations | Through extension contract + whitelist implementation, no need to transfer OwnerCap |
| Multi-sig | Sui native multi-sig address suitable for alliance co-ownership scenarios |
Further Reading
- Ownership Model Documentation
- Smart Character Documentation
- character.move Source Code
- Sui Multi-sig Documentation
- Receiving Object Pattern
Chapter 12: Advanced Move — Generics, Dynamic Fields, and Event Systems
Goal: Master generics programming in Move, dynamic field storage, Table/VecMap data structures, and event systems, enabling you to independently design complex on-chain data models.
Status: Advanced design chapter. Main content focuses on generics, dynamic fields, events, and Table/VecMap.
12.1 Generics
Generics allow your code to work with multiple types while maintaining type safety. This is widely used in EVE Frontier’s OwnerCap.
Basic Generic Syntax
// T is a type parameter, similar to <T> in other languages
public struct Box<T: store> has key, store {
id: UID,
value: T,
}
// Generic function
public fun wrap<T: store>(value: T, ctx: &mut TxContext): Box<T> {
Box { id: object::new(ctx), value }
}
public fun unwrap<T: store>(box: Box<T>): T {
let Box { id, value } = box;
id.delete();
value
}
Phantom Type Parameters
A phantom T parameter doesn't actually hold a value of type T; it is used only for type distinction:
// T is not actually used, but creates type distinction
public struct OwnerCap<phantom T> has key {
id: UID,
authorized_object_id: ID,
}
// These two are completely different types, the system won't confuse them
let gate_cap: OwnerCap<Gate> = ...;
let ssu_cap: OwnerCap<StorageUnit> = ...;
Generics with Constraints
// T must have both key and store abilities
public fun transfer_to_object<T: key + store, Container: key>(
container: &mut Container,
value: T,
) { ... }
// T must have copy and drop (temporary value, not an asset)
public fun log_value<T: copy + drop>(value: T) { ... }
Why Are Generics Particularly Important in Move?
Because many safety designs in Move don’t rely on “passing a string to identify the type,” but instead put the type itself into the interface.
The advantages of this approach:
- Type mismatches can be detected at compile time
- Permissions and object categories can be tightly bound
- You don’t need to manually write fragile type checks at runtime
What Does phantom Really Solve?
When you first see phantom T, it’s easy to think it’s just a syntax trick. Actually, it solves:
“I don’t need to actually store a T, but I need this type identity to participate in security boundaries.”
This is especially common in permission objects, because what permissions really care about is often not the data itself, but “who this permission card is for.”
When Should You Use Generics, and When Shouldn’t You?
Scenarios suitable for generics:
- Permission objects
- Generic containers
- Same logic serving multiple object types
- The type itself carries security meaning
Scenarios not suitable for over-genericization:
- Business semantics are already very specific
- Only one or two fixed object types
- Generics would significantly increase interface reading cost
In other words, generics aren’t for “looking advanced,” but for clearly expressing “this logic is naturally generic.”
12.2 Dynamic Fields
Sui has a powerful feature: Dynamic Fields, which allow you to add arbitrary key-value pairs to objects at runtime, without needing to define all fields at compile time.
Why Do We Need Dynamic Fields?
Suppose your storage box needs to support any type of item, and the item types are unknown at compile time:
// ❌ Inflexible way: fixed fields
public struct Inventory has key {
id: UID,
fuel: Option<u64>,
ore: Option<u64>,
// Adding new item types requires modifying the contract...
}
// ✅ Flexible way: dynamic fields
public struct Inventory has key {
id: UID,
// No predefined fields, use dynamic fields for storage
}
Dynamic Fields API
use sui::dynamic_field as df;
use sui::dynamic_object_field as dof;
// Add dynamic field (value is not an object type)
df::add(&mut inventory.id, b"fuel_amount", 1000u64);
// Read dynamic field
let fuel: &u64 = df::borrow(&inventory.id, b"fuel_amount");
let fuel_mut: &mut u64 = df::borrow_mut(&mut inventory.id, b"fuel_amount");
// Check if exists
let exists = df::exists_(&inventory.id, b"fuel_amount");
// Remove dynamic field
let old_value: u64 = df::remove(&mut inventory.id, b"fuel_amount");
// Dynamic object field (value itself is an object with independent ObjectID)
dof::add(&mut storage.id, item_type_id, item_object);
let item = dof::borrow<u64, Item>(&storage.id, item_type_id);
let item = dof::remove<u64, Item>(&mut storage.id, item_type_id);
Real Application in EVE Frontier
The Ephemeral Inventory in storage units is implemented using dynamic fields:
// Create ephemeral inventory for a specific character (using character OwnerCap ID as key)
df::add(
&mut storage_unit.id,
owner_cap_id, // Use character's OwnerCap ID as key
EphemeralInventory::new(ctx),
);
// Character accesses their own ephemeral inventory
let my_inventory = df::borrow_mut<ID, EphemeralInventory>(
&mut storage_unit.id,
my_owner_cap_id,
);
The Real Value of Dynamic Fields
Its greatest value isn’t “avoiding struct definition changes,” but:
Allowing objects to grow new sub-states at runtime, without having to hardcode all slots in advance.
This is especially critical for game-type systems, because many states are naturally open sets:
- A warehouse might contain many types of items
- A facility might serve many characters
- A market might have continuously new listings
If you write them all as fixed fields, your structure will quickly lose control.
When to Use dynamic_field vs dynamic_object_field?
A very practical decision criterion:
- The value is a simple value or an ordinary struct: use dynamic_field
- The value should itself be an independent object: use dynamic_object_field
The latter is more suitable for:
- Needs independent object ID
- Needs to be transferred, referenced, or deleted separately
- May be operated on separately by other logic later
Most Common Mistakes with Dynamic Fields
1. Treating it as a “universal database”
Dynamic fields are very flexible, but not infinitely free. They bring:
- Higher read/write costs
- More complex index paths
- Higher debugging difficulty
2. Key design is too casual
If key design is unstable, you’ll encounter later:
- Can’t find original data for the same business entity
- Inconsistent mapping rules between off-chain and on-chain
- Data seems to be written successfully, but can’t be read back
3. Putting frequently traversed large collections directly into dynamic fields
Dynamic fields are suitable for locating by key, not naturally suited for high-frequency full traversal. As long as your business often needs to “scan all entries,” you need to start considering index and pagination strategies.
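When you do need to scan many dynamic fields, the usual approach is cursor-based pagination rather than one huge read. A generic sketch (the page shape loosely mirrors the @mysten/sui getDynamicFields response, but treat it as an assumption):

```typescript
interface Page<T> {
  data: T[];
  nextCursor: string | null;
  hasNextPage: boolean;
}

// Drain a cursor-paginated API page by page, accumulating all entries.
async function fetchAllEntries<T>(
  fetchPage: (cursor: string | null) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | null = null;
  for (;;) {
    const page = await fetchPage(cursor);
    all.push(...page.data);
    if (!page.hasNextPage || page.nextCursor === null) break;
    cursor = page.nextCursor;
  }
  return all;
}
```

In practice you would also cap the number of pages fetched, since "scan everything" is exactly the access pattern dynamic fields penalize.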
12.3 Table and VecMap: On-chain Collection Types
Table: Key-Value Mapping
use sui::table::{Self, Table};
public struct Registry has key {
id: UID,
members: Table<address, MemberInfo>,
}
// Add
table::add(&mut registry.members, member_addr, MemberInfo { ... });
// Query
let info = table::borrow(&registry.members, member_addr);
let info_mut = table::borrow_mut(&mut registry.members, member_addr);
// Existence check
let is_member = table::contains(&registry.members, member_addr);
// Remove
let old_info = table::remove(&mut registry.members, member_addr);
// Length
let count = table::length(&registry.members);
⚠️ Note: Each entry in a Table is an independent dynamic field on-chain, and each access has a separate cost. A transaction can access at most 1024 dynamic fields.
VecMap: Small-scale Ordered Mapping
use sui::vec_map::{Self, VecMap};
// VecMap is stored in object fields (not dynamic fields), suitable for small datasets
public struct Config has key {
id: UID,
toll_settings: VecMap<u64, u64>, // zone_id -> toll_amount
}
// Operations
vec_map::insert(&mut config.toll_settings, zone_id, amount);
let amount = vec_map::get(&config.toll_settings, &zone_id);
vec_map::remove(&mut config.toll_settings, &zone_id);
Selection Recommendations
| Scenario | Recommended Type |
|---|---|
| Large-scale, dynamically growing collections | Table |
| Less than 100 entries, needs traversal | VecMap or vector |
| Values are objects (with independent ObjectID) | dynamic_object_field |
| Values are simple values (u64, bool, etc.) | dynamic_field |
What Essentially Is Table?
It’s essentially not a “hash table in memory,” but an on-chain collection abstraction built on dynamic fields.
So when using Table, you should always remember three things:
- Each read/write has real on-chain cost
- The more entries, the more strategy needed for operations and troubleshooting
- It’s more like an “extensible index structure,” not a local container to use casually
Why Is VecMap Suitable for Small-scale Configuration?
Because it stores data directly in object fields, usually more suitable for:
- Small number of configuration items
- Needs full reading
- Needs traversal by insertion order or small scale
Typical examples include:
- Fee tier tables
- Small-scale whitelists
- Mode switch configurations
What to Really Ask When Choosing Types
Don’t just ask “can this container store it,” but ask:
- How large will this collection grow?
- Am I doing exact key lookups, or frequently traversing all?
- Are the values independent objects?
- Will I need to do pagination and indexing on it in the future?
Once these four questions are answered, container selection usually won’t be too off.
12.4 Event Systems
Events are the bridge between on-chain contracts and off-chain applications. Events are not stored in on-chain state, but are attached to transaction records and can be captured by indexers.
Defining and Emitting Events
use sui::event;
// Event struct: only needs copy + drop
public struct GateJumped has copy, drop {
gate_id: ID,
character_id: ID,
destination_gate_id: ID,
timestamp_ms: u64,
toll_paid: u64,
}
public struct ItemSold has copy, drop {
storage_unit_id: ID,
seller: address,
buyer: address,
item_type_id: u64,
price: u64,
}
// Emit event in function
public fun process_purchase(
storage_unit: &mut StorageUnit,
buyer: &Character,
payment: Coin<SUI>,
item_type_id: u64,
ctx: &mut TxContext,
): Item {
let price = coin::value(&payment);
// ... process purchase logic ...
// Emit event (events go into transaction effects, not object state, so emitting is cheap)
event::emit(ItemSold {
storage_unit_id: object::id(storage_unit),
seller: storage_unit.owner_address,
buyer: ctx.sender(),
item_type_id,
price,
});
// ... return item ...
}
The most easily misunderstood aspect of events:
An event is a record of "what happened in a transaction," not the source of truth for "what the current system state is."
This distinction matters: many frontend and indexer design problems start from treating events as state.
What Are Events Suitable for Expressing?
Most suitable for expressing:
- Something just happened
- Who triggered it
- What were the key parameters at the time
- What should off-chain systems do based on this subscription or notification
For example:
- Transaction records
- Jump records
- Claim triggers
- Authorization changes
What Should Events Not Independently Bear?
Not suitable for independently bearing:
- Current inventory truth
- Whether the current object is online
- Complete business state of a current facility
Because events are naturally timelines, not current state snapshots.
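One way to internalize "events are a timeline, not a snapshot": aggregates must be derived by folding over the event history, they are not stored in any single event. A small illustrative sketch, reusing the ItemSold shape from above:

```typescript
// Events record what happened; totals are computed by folding over the log.
interface ItemSoldEvent {
  itemTypeId: number;
  price: number;
}

function totalVolume(events: ItemSoldEvent[]): number {
  return events.reduce((sum, e) => sum + e.price, 0);
}
```

This is exactly the job an indexer does continuously: replay the event stream into views (volume, active users) that no on-chain object holds directly.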
Listening to Events in TypeScript
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" });
// Query historical events
const events = await client.queryEvents({
query: {
MoveEventType: `${MY_PACKAGE}::toll_gate_ext::GateJumped`,
},
limit: 50,
});
events.data.forEach(event => {
const fields = event.parsedJson as {
gate_id: string;
character_id: string;
toll_paid: string;
};
console.log(`Jump: ${fields.character_id} paid ${fields.toll_paid}`);
});
// Real-time subscription (WebSocket)
const unsubscribe = await client.subscribeEvent({
filter: { Package: MY_PACKAGE },
onMessage: (event) => {
console.log("New event:", event.type, event.parsedJson);
},
});
// Stop subscription
setTimeout(() => unsubscribe(), 60_000);
When Designing Events, How to Think About Fields?
A good event should at least answer:
- Who did it
- On which object
- What did they do
- What are the key business parameters
- How should off-chain systems locate related objects based on this
If there are too few fields, off-chain is hard to consume; too many fields will bloat the event and blur semantics.
A Very Practical Combination Principle
Mature on-chain systems usually adopt this combination:
- Objects Store current state
- Events Store historical actions
- Index layer Reorganize objects and events into data views that are easy for frontends to use
This is also why when you read GraphQL, indexer, and dApp chapters later, you’ll always see “object queries + event queries” appearing together.
Driving dApp Real-time Updates with Events
// src/hooks/useGateEvents.ts
import { useEffect, useState } from 'react'
import { SuiClient } from '@mysten/sui/client'
interface JumpEvent {
gate_id: string
character_id: string
toll_paid: string
timestamp_ms: string
}
export function useGateEvents(packageId: string) {
const [events, setEvents] = useState<JumpEvent[]>([])
useEffect(() => {
const client = new SuiClient({ url: 'https://fullnode.testnet.sui.io:443' })
const subscribe = async () => {
return client.subscribeEvent({
filter: { MoveEventType: `${packageId}::toll_gate_ext::GateJumped` },
onMessage: (event) => {
setEvents(prev => [event.parsedJson as JumpEvent, ...prev.slice(0, 49)])
},
})
}
const unsubscribePromise = subscribe()
// Clean up the WebSocket subscription when the component unmounts
return () => { unsubscribePromise.then(unsubscribe => unsubscribe()) }
}, [packageId])
return events
}
12.5 Dynamic Fields vs Events Use Cases
| Need | Solution |
|---|---|
| Persistent collection data storage | Dynamic fields / Table |
| Historical record queries (no need to keep in contract) | Events |
| Real-time notification to off-chain systems | Events |
| State checks within contracts | Dynamic fields |
| Analysis and statistical data (transaction volume, active users) | Events + off-chain indexing |
12.6 Practice: Designing a Trackable Auction State Machine
Integrating the knowledge from this chapter, design a complex auction state object:
module my_auction::auction;
use sui::object::{Self, UID, ID};
use sui::table::{Self, Table};
use sui::event;
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::dynamic_field as df;
/// Auction status enumeration (represented by u8)
const STATUS_OPEN: u8 = 0;
const STATUS_ENDED: u8 = 1;
const STATUS_CANCELLED: u8 = 2;
/// Error codes
const EAuctionNotOpen: u64 = 0;
const EAuctionEnded: u64 = 1;
const EBidTooLow: u64 = 2;
/// Auction object
public struct Auction<phantom ItemType: key + store> has key {
id: UID,
status: u8,
min_bid: u64,
current_bid: u64,
current_winner: Option<address>,
end_time_ms: u64,
bid_history_count: u64,
// Bid history stored with dynamic fields (avoid large objects)
}
/// A single bid record, stored under a dynamic field on the auction
public struct BidRecord has store {
bidder: address,
amount: u64,
timestamp_ms: u64,
}
/// Bid event
public struct BidPlaced has copy, drop {
auction_id: ID,
bidder: address,
amount: u64,
timestamp_ms: u64,
}
/// Bid function
public fun place_bid<T: key + store>(
auction: &mut Auction<T>,
payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let bid_amount = coin::value(&payment);
let now = clock.timestamp_ms();
// Verification
assert!(auction.status == STATUS_OPEN, EAuctionNotOpen);
assert!(now < auction.end_time_ms, EAuctionEnded);
assert!(bid_amount > auction.current_bid, EBidTooLow);
// Refund previous bidder's bid (simplified version)
// ...
// Update auction state
auction.current_bid = bid_amount;
auction.current_winner = option::some(ctx.sender());
// Record bid history (using dynamic fields)
let bid_key = auction.bid_history_count;
auction.bid_history_count = bid_key + 1;
df::add(&mut auction.id, bid_key, BidRecord {
bidder: ctx.sender(),
amount: bid_amount,
timestamp_ms: now,
});
// Emit event (for dApp real-time display)
event::emit(BidPlaced {
auction_id: object::id(auction),
bidder: ctx.sender(),
amount: bid_amount,
timestamp_ms: now,
});
}
🔖 Chapter Summary
| Knowledge Point | Core Points |
|---|---|
| Generics | <T> type parameter + phantom T type distinction |
| Dynamic Fields | Add fields at runtime, df::add/borrow/remove, max 1024/tx |
| Table | Large-scale on-chain KV storage, table::add/borrow/contains |
| VecMap | Small ordered KV, stored in fields, suitable for config tables |
| Events | has copy + drop, event::emit(), can be subscribed off-chain |
| Events vs Dynamic Fields | Temporary notifications use events; persistent state uses dynamic fields |
📚 Further Reading
Chapter 13: NFT Design & Metadata Management
Goal: Master Sui’s NFT standard (Display), design evolvable dynamic NFTs, and apply NFTs as permission credentials, achievement badges, and game assets in the EVE Frontier ecosystem.
Status: Advanced design chapter. Main content focuses on NFT standards, dynamic metadata, and Collection patterns.
13.1 Sui’s NFT Model
On Sui, an NFT is simply a unique object with the key ability. There’s no special “NFT contract” - any object with a unique ObjectID is naturally an NFT:
// Simplest NFT
public struct Badge has key, store {
id: UID,
name: vector<u8>,
description: vector<u8>,
image_url: vector<u8>,
}
The most important understanding isn’t “NFTs can display images,” but rather:
An NFT on Sui is first an object, and second a collectible or display item.
This means you can naturally use NFTs in three different scenarios:
- Pure display Badges, memorabilia, achievement proofs
- Permission-based Passes, membership cards, whitelist credentials
- Functional Upgradeable ships, equipment, subscriptions, rental certificates
The design priorities for these three types of NFTs are completely different.
Four questions to ask before designing an NFT
- Is it primarily a display item, permission card, or operational asset?
- Is it transferable?
- Will its metadata change?
- Should frontends and markets treat it as a “tradable commodity”?
If these four questions aren’t answered clearly, the subsequent Display, Collection, and TransferPolicy will be easy to misalign.
13.2 Sui Display Standard: Making NFTs Display Correctly Everywhere
The Display object tells wallets and markets how to display your NFT:
module my_nft::space_badge;
use sui::display;
use sui::package;
use std::string::{String, utf8};
// One-time witness (create Publisher)
public struct SPACE_BADGE has drop {}
public struct SpaceBadge has key, store {
id: UID,
name: String,
tier: u8, // 1=Bronze, 2=Silver, 3=Gold
earned_at_ms: u64,
image_url: String,
}
fun init(witness: SPACE_BADGE, ctx: &mut TxContext) {
// 1. Use OTW to create Publisher (prove package author identity)
let publisher = package::claim(witness, ctx);
// 2. Create Display (define how to display SpaceBadge)
let mut display = display::new_with_fields<SpaceBadge>(
&publisher,
// Field name // Template value ({field_name} will be replaced by actual field value)
vector[
utf8(b"name"),
utf8(b"description"),
utf8(b"image_url"),
utf8(b"project_url"),
],
vector[
utf8(b"{name}"), // NFT name
utf8(b"EVE Frontier Builder Badge - Tier {tier}"), // Description
utf8(b"{image_url}"), // Image URL
utf8(b"https://evefrontier.com"), // Project link
],
ctx,
);
// 3. Submit Display (freeze version, make it externally visible)
display::update_version(&mut display);
// 4. Transfer (Publisher to deployer, Display shared or frozen)
transfer::public_transfer(publisher, ctx.sender());
transfer::public_freeze_object(display);
}
What does Display really solve?
It solves the interpretation layer problem between “on-chain object fields” and “wallet and market display content.”
Without this layer:
- Wallets can only see raw fields
- Markets have difficulty uniformly displaying name, description, image
- The same type of NFT will display inconsistently across different frontends
So Display isn’t decoration, it’s part of the NFT product experience.
Most common mistakes when designing Display
1. Stuffing all display semantics into on-chain fields
Not all display copy needs to be mutable on-chain fields. Some stable descriptions are better suited for templates, while some dynamic state is better suited for fields.
2. Over-relying on external image URLs
If image resource paths are unstable, the NFT itself still exists, but the user-visible experience will collapse.
3. Field naming disconnected from frontend understanding
If on-chain fields are named too internally, the frontend and wallet layer will have difficulty interpreting them stably.
13.3 Dynamic NFTs: Evolving Metadata
EVE Frontier’s game state changes in real-time, and your NFT metadata can change along with it:
module my_nft::evolving_ship;
use std::string::String;
/// Evolvable ship NFT
public struct EvolvingShip has key, store {
id: UID,
name: String,
hull_class: u8, // 0=Frigate, 1=Cruiser, 2=Battleship
combat_score: u64, // Combat points (increase with battles)
kills: u64, // Kill count
image_url: String, // Changes based on hull_class
}
/// Record combat result (called by turret contract)
public fun record_kill(
ship: &mut EvolvingShip,
ctx: &TxContext,
) {
ship.kills = ship.kills + 1;
ship.combat_score = ship.combat_score + 100;
// Upgrade ship level (evolution)
if (ship.combat_score >= 10_000 && ship.hull_class < 2) {
ship.hull_class = ship.hull_class + 1;
// Update image URL (point to higher-level asset)
ship.image_url = get_image_url(ship.hull_class);
}
}
fun get_image_url(class: u8): String {
let base = b"https://assets.evefrontier.com/ships/";
let suffix = if class == 0 { b"frigate.png" }
else if class == 1 { b"cruiser.png" }
else { b"battleship.png" };
// Concatenate URL (string operations in Move use std::string)
let mut url = std::string::utf8(base);
url.append(std::string::utf8(suffix));
url
}
Display template auto-updates: Since Display renders using current values of fields like {hull_class} and {image_url}, when fields change, the NFT’s display in wallets also updates immediately.
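The substitution the Display layer performs can be sketched like this (an illustration of the `{field}` convention, not Sui's actual renderer):

```typescript
// Replace each {field} placeholder with the object's current field value;
// unknown placeholders are left untouched.
function renderDisplayTemplate(template: string, fields: Record<string, string>): string {
  return template.replace(/\{(\w+)\}/g, (match, name) => fields[name] ?? match);
}
```

Because rendering always reads the current field values, mutating `hull_class` or `image_url` on-chain is enough to change what wallets show, with no template update needed.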
What are dynamic NFTs suited for and not suited for?
Suited for:
- Growth-oriented assets
- Items whose value is affected by state
- In-game combat records, achievements, proficiency mappings
Not necessarily suited for:
- Collectibles emphasizing static scarcity narratives
- Assets where the secondary market heavily relies on fixed metadata
Because once metadata is mutable, you’ve introduced new product issues by default:
- Who can modify it?
- Are changes auditable?
- When players buy in, are they buying the current state or a potentially changing future state?
Key boundaries of dynamic metadata design
- Is state change traceable on-chain? Best to have event records
- Are modification permissions clear? It must not be possible for any module to arbitrarily modify state
- Can the frontend correctly reflect changes? Otherwise the chain updates while the user interface stays on the old image
13.4 Collection Pattern
module my_nft::badge_collection;
/// Badge series collection (meta-object, describes this NFT series)
public struct BadgeCollection has key {
id: UID,
name: String,
total_supply: u64,
minted_count: u64,
admin: address,
}
/// Individual badge
public struct AllianceBadge has key, store {
id: UID,
collection_id: ID, // Which collection it belongs to
serial_number: u64, // Series number (nth minted)
tier: u8,
attributes: vector<NFTAttribute>,
}
public struct NFTAttribute has store, copy, drop {
trait_type: String,
value: String,
}
/// Mint badge (track number and total)
public fun mint_badge(
collection: &mut BadgeCollection,
recipient: address,
tier: u8,
attributes: vector<NFTAttribute>,
ctx: &mut TxContext,
) {
assert!(ctx.sender() == collection.admin, ENotAdmin);
assert!(collection.minted_count < collection.total_supply, ESoldOut);
collection.minted_count = collection.minted_count + 1;
let badge = AllianceBadge {
id: object::new(ctx),
collection_id: object::id(collection),
serial_number: collection.minted_count,
tier,
attributes,
};
transfer::public_transfer(badge, recipient);
}
The value of Collection isn’t just “categorizing a batch of NFTs,” but making series management clear:
- Supply control
- Number tracking
- Official series identity
- Frontend aggregated display
What problems are Collections best suited to solve?
- Is a given series sold out?
- Which series does asset #N belong to?
- Does a badge come from the official issuance system?
Without this collection layer, doing these later becomes much harder:
- Series pages
- Rarity statistics
- Official certification
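The collection-level queries above reduce to simple arithmetic over the two counters. A minimal TypeScript sketch (the field names mirror the Move struct; how you fetch the object from chain is assumed and omitted):

```typescript
// Mirror of the on-chain BadgeCollection counters we care about.
interface CollectionState {
  totalSupply: number;
  mintedCount: number;
}

// Is the series sold out?
function isSoldOut(c: CollectionState): boolean {
  return c.mintedCount >= c.totalSupply;
}

// How many badges can still be minted?
function remainingSupply(c: CollectionState): number {
  return Math.max(0, c.totalSupply - c.mintedCount);
}
```

This is exactly why the meta-object pays off: a series page or rarity tool only needs to read one shared object instead of scanning every badge.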
13.5 NFTs as Access Control Credentials
In EVE Frontier, NFTs are the most natural permission carriers:
// Using NFTs to check permissions
public fun enter_restricted_zone(
gate: &Gate,
character: &Character,
badge: &AllianceBadge, // Must hold badge to call
clock: &Clock,
ctx: &mut TxContext,
) {
// Verify badge tier (need gold badge to enter)
assert!(badge.tier >= 3, EInsufficientBadgeTier);
// Verify badge belongs to the correct collection (prevent forgery)
// (a Move constant cannot hold an ID, so in practice the expected
// collection ID would be stored in the Gate's own configuration)
assert!(badge.collection_id == gate.official_collection_id, EWrongCollection);
// ...
}
This is one of the most practical NFT use cases in EVE Builder, because it makes “permission” into an object that players can actually hold and understand.
Why are permission NFTs often better than address whitelists?
Because they’re more flexible and product-oriented:
- Can be transferred
- Can be revoked
- Can have tiers
- Can have expiration times
- Frontend can intuitively display them
But you must be careful of one thing:
As long as it’s transferable, the permission flows with it.
So you must first decide whether this permission NFT should be:
- A transferable market asset
- Or a non-transferable identity credential
13.6 NFT Transfer Policies
Sui supports flexible NFT transfer policies:
// Default: anyone can transfer (public_transfer)
transfer::public_transfer(badge, recipient);
// Lock-up: NFT can only be moved by specific contracts (via TransferPolicy)
use sui::transfer_policy;
// Establish TransferPolicy during package initialization (restrict transfer conditions)
fun init(witness: SPACE_BADGE, ctx: &mut TxContext) {
let publisher = package::claim(witness, ctx);
let (policy, policy_cap) = transfer_policy::new<SpaceBadge>(&publisher, ctx);
// Add custom rules (such as royalty payments)
// royalty_rule::add(&mut policy, &policy_cap, 200, 0); // 2% royalty
transfer::public_share_object(policy);
transfer::public_transfer(policy_cap, ctx.sender());
transfer::public_transfer(publisher, ctx.sender());
}
Transfer policy essentially defines “the social attributes of this NFT”
- Free transfer → more like a commodity
- Restricted transfer → more like a permit with rules
- Non-transferable → more like an identity or achievement
This isn’t a technical detail, it’s product positioning.
If your NFT is:
- Membership status
- Real-name credential
- Alliance internal identity card
Then default free transfer often isn’t a good idea.
13.7 Embedding NFTs in EVE Frontier Assets (Object Owns Object)
// Ship equipment NFT (owned by ship object)
public struct Equipment has key, store {
id: UID,
name: String,
stat_bonus: u64,
}
public struct Ship has key {
id: UID,
// Equipment embedded in Ship object (object owns object)
equipped_items: vector<Equipment>,
}
// Equip item to ship
public fun equip(
ship: &mut Ship,
equipment: Equipment, // Equipment moves from player wallet into Ship
ctx: &TxContext,
) {
vector::push_back(&mut ship.equipped_items, equipment);
}
Object-owns-object design is especially natural for game assets, because it allows you to express:
- A ship owns multiple pieces of equipment
- A character owns a set of certificates
- A container holds multiple special assets
When should NFTs exist independently vs. be embedded?
Suited for independent existence:
- Need to trade separately
- Need to display separately
- Need to authorize or transfer separately
Suited for embedding into other objects:
- Mainly as a component of a larger object
- Don’t need frequent independent circulation
- More emphasis on combined overall state
This is essentially balancing “tradability” and “compositional expressiveness.”
Chapter Summary
| Knowledge Point | Core Points |
|---|---|
| Sui NFT Essence | Unique object with key, ObjectID is NFT ID |
| Display Standard | display::new_with_fields() defines wallet display template |
| Dynamic NFT | Mutable fields + Display template references fields → auto-sync display |
| Collection Pattern | MetaObject tracks supply and numbering |
| NFT as Permission | Pass NFT reference for permission checks, more flexible than address whitelist |
| TransferPolicy | Control NFT secondary market transfer rules (such as royalties) |
Further Reading
Chapter 14: On-chain Economic System Design
Goal: Learn to design and implement complete on-chain economic systems in EVE Frontier, including custom token issuance, decentralized markets, dynamic pricing, and vault management.
Status: Advanced design chapter. Main content focuses on tokens, markets, vaults, and pricing mechanisms.
14.1 EVE Frontier’s Economic System
EVE Frontier itself already has two official currencies:
| Currency | Purpose | Features |
|---|---|---|
| LUX | In-game mainstream trading currency | Stable, used for daily services and commodity trading |
| EVE Token | Ecosystem token | Used for developer incentives, can purchase special assets |
As a Builder, you can:
- Accept LUX/SUI as payment methods (directly use official Coin types)
- Issue your own alliance token (custom Coin module)
- Build markets and trading mechanisms (based on SSU extensions)
The most important thing here isn't the raw capability of "being able to issue tokens and charge fees," but first distinguishing:
What your economic system is actually selling, why anyone would keep paying, and under what circumstances it will be arbitraged or drained.
Many on-chain economic designs fail not because the code was wrong, but because they never figured out these things from the start:
- Are you selling one-time items, ongoing services, or access qualifications?
- Is revenue settled immediately or distributed long-term?
- Who determines the price? Fixed, algorithmic, auction, or manual operation?
- Why would players keep assets in your system rather than use and leave?
First distinguish four most common Builder fee models
| Model | What users buy | Typical scenario | Risk point |
|---|---|---|---|
| One-time purchase | An item or one action | Vending machine, gate jump fee | Easily becomes pure price comparison market |
| Usage rights purchase | Access or capability for a period | Rental, subscription, pass | Expiration, refund, abuse boundary complex |
| Matchmaking commission | Platform traffic and transaction matching | Market, auction, insurance matching | Fake transactions, self-dealing, Sybil volume manipulation |
| Long-term vault distribution | Share of system cash flow | Alliance vault, protocol revenue distribution | Complex governance, large distribution disputes |
Before designing an economic system, first decide which of these categories you fall into, because each implies a completely different object model, event design, and set of risk controls.
14.2 Issuing Custom Tokens (Custom Coin)
Sui’s token (Coin) model is very standardized. Through the sui::coin module, you can create any Fungible Token:
module my_alliance::alliance_token;
use sui::coin::{Self, Coin, TreasuryCap};
use sui::object::UID;
use sui::transfer;
use sui::tx_context::TxContext;
/// Token's "One-Time Witness"
/// Must match module name (all caps), can only be created during init
public struct ALLIANCE_TOKEN has drop {}
/// Token metadata (name, symbol, decimals)
fun init(witness: ALLIANCE_TOKEN, ctx: &mut TxContext) {
let (treasury_cap, coin_metadata) = coin::create_currency(
witness,
6, // Decimals
b"ALLY", // Token symbol
b"Alliance Token", // Token full name
b"The official token of Alliance X", // Description
option::none(), // Icon URL (optional)
ctx,
);
// Transfer TreasuryCap to deployer (minting rights)
transfer::public_transfer(treasury_cap, ctx.sender());
// Share CoinMetadata (for DEX, wallet display)
transfer::public_share_object(coin_metadata);
}
/// Mint tokens (only holder of TreasuryCap can call)
public fun mint(
treasury: &mut TreasuryCap<ALLIANCE_TOKEN>,
amount: u64,
recipient: address,
ctx: &mut TxContext,
) {
let coin = coin::mint(treasury, amount, ctx);
transfer::public_transfer(coin, recipient);
}
/// Burn tokens (reduce total supply)
public fun burn(
treasury: &mut TreasuryCap<ALLIANCE_TOKEN>,
coin: Coin<ALLIANCE_TOKEN>,
) {
coin::burn(treasury, coin);
}
Issuing tokens is technically simple, but economically most easily misused.
Three questions to ask before issuing tokens
1. Why does this token exist?
Common reasonable uses include:
- Alliance internal accounting and incentives
- Protocol internal discounts, revenue sharing, or voting credentials
- Some service quota or access layer
If the answer is just “everyone has tokens, so I’ll issue one too,” it’s probably not worth doing.
2. Does this token really need on-chain circulation?
Some points-based systems don’t actually need an independent coin, better suited for:
- On-chain scoring objects
- Non-transferable badges
- Vault share records
Because once you make it a truly transferable Coin, you’ve introduced by default:
- Secondary markets
- Hoarding and speculation
- Liquidity expectations
- Higher compliance and operational burden
3. Who controls supply, how does supply grow?
TreasuryCap is technically the right to mint; economically, it is monetary sovereignty. As long as the supply strategy stays vague, the system easily degenerates into:
- Builder arbitrarily inflating
- Early users being diluted
- Price and expectations rapidly collapsing
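The dilution risk can be made concrete with a little arithmetic. A TypeScript sketch (hypothetical numbers) of how a holder's share shrinks when whoever holds the TreasuryCap mints more supply:

```typescript
// Share of total supply held by one account, in basis points (1 bps = 0.01%).
function shareBps(holderBalance: number, totalSupply: number): number {
  return Math.floor((holderBalance * 10_000) / totalSupply);
}

// A holder with 1,000 ALLY out of 10,000 total owns 10% (1,000 bps).
// If the treasury mints another 10,000 to itself, the same balance is
// suddenly only 5% (500 bps) — the holder did nothing, yet lost half
// their relative claim on the system.
const before = shareBps(1_000, 10_000);
const after = shareBps(1_000, 20_000);
```

This is why a published supply schedule (or a capped/burned TreasuryCap) matters more than any technical detail of minting.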
What does One-Time Witness solve and not solve?
It solves:
- Coin type creation identity uniqueness
- Initialization path standardization
- Metadata and TreasuryCap creation process security
It doesn’t solve:
- Whether your supply curve is reasonable
- Whether the coin has demand
- Whether the coin price is stable
In other words, the language ensures “coins won’t be randomly forged,” but won’t ensure “you issued a good coin.”
14.3 Building Decentralized Markets
Based on Smart Storage Unit, you can build decentralized item markets:
module my_market::item_market;
use world::storage_unit::{Self, StorageUnit};
use world::character::Character;
use world::inventory::Item;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::table::{Self, Table};
use sui::object::{Self, ID};
use sui::event;
/// Market extension Witness
public struct MarketAuth has drop {}
/// Item listing information
public struct Listing has store {
seller: address,
item_type_id: u64,
price: u64, // In MIST (SUI's smallest unit)
expiry_ms: u64, // 0 = never expires
}
/// Market registry
public struct Market has key {
id: UID,
storage_unit_id: ID,
listings: Table<u64, Listing>, // item_type_id -> Listing
fee_rate_bps: u64, // Fee (basis points, 100 bps = 1%)
fee_balance: Balance<SUI>,
}
/// Events
public struct ItemListed has copy, drop {
market_id: ID,
seller: address,
item_type_id: u64,
price: u64,
}
public struct ItemSold has copy, drop {
market_id: ID,
buyer: address,
seller: address,
item_type_id: u64,
price: u64,
fee: u64,
}
/// List item
public fun list_item(
market: &mut Market,
storage_unit: &mut StorageUnit,
character: &Character,
item_type_id: u64,
price: u64,
expiry_ms: u64,
ctx: &mut TxContext,
) {
// Withdraw item from storage box, store in market's dedicated temporary warehouse
// (Implementation detail: use MarketAuth{} to call SSU's withdraw_item)
// ...
// Record listing information
table::add(&mut market.listings, item_type_id, Listing {
seller: ctx.sender(),
item_type_id,
price,
expiry_ms,
});
event::emit(ItemListed {
market_id: object::id(market),
seller: ctx.sender(),
item_type_id,
price,
});
}
/// Buy item
public fun buy_item(
market: &mut Market,
storage_unit: &mut StorageUnit,
character: &Character,
item_type_id: u64,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
): Item {
let listing = table::borrow(&market.listings, item_type_id);
// Check expiration
if (listing.expiry_ms > 0) {
assert!(clock.timestamp_ms() < listing.expiry_ms, EListingExpired);
}
// Verify payment amount
assert!(coin::value(&payment) >= listing.price, EInsufficientPayment);
// Deduct fee
let fee = listing.price * market.fee_rate_bps / 10_000;
let seller_amount = listing.price - fee;
// Split coins: fee + seller revenue + change
let fee_coin = payment.split(fee, ctx);
let seller_coin = payment.split(seller_amount, ctx);
let change = payment; // Remaining change
balance::join(&mut market.fee_balance, coin::into_balance(fee_coin));
transfer::public_transfer(seller_coin, listing.seller);
transfer::public_transfer(change, ctx.sender());
let seller_addr = listing.seller;
let price = listing.price;
// Remove the listing record (Listing has store but no drop, so it must be unpacked)
let Listing { seller: _, item_type_id: _, price: _, expiry_ms: _ } =
    table::remove(&mut market.listings, item_type_id);
event::emit(ItemSold {
market_id: object::id(market),
buyer: ctx.sender(),
seller: seller_addr,
item_type_id,
price,
fee,
});
// Withdraw item from SSU to buyer
storage_unit::withdraw_item(
storage_unit, character, MarketAuth {}, item_type_id, ctx,
)
}
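The settlement math in buy_item (basis-point fee, seller proceeds, change) is easy to get wrong around rounding. A TypeScript sketch of the same split, using the same integer division Move performs on u64 values:

```typescript
// Split a payment into fee, seller proceeds, and change, mirroring the
// integer arithmetic in buy_item. All amounts are in MIST.
function settle(price: number, paid: number, feeRateBps: number) {
  if (paid < price) throw new Error("insufficient payment");
  // Integer division rounds the fee down, so dust favors the seller.
  const fee = Math.floor((price * feeRateBps) / 10_000);
  const seller = price - fee;
  const change = paid - price;
  return { fee, seller, change };
}
```

Note that fee + seller always equals price exactly, so the market never mints or loses a single MIST: whatever rounding removes from the fee is added to the seller's side.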
This market example already illustrates the basic structure, but for real design you need to think through a few more things.
What are the minimum four layers a market contains?
- Order layer: what the seller offers, at what price, and when it expires
- Custody layer: who holds the items and funds, and when they actually change hands
- Settlement layer: how fees, seller proceeds, and change are distributed
- Index layer: how the frontend queries the currently buyable list, not just historical events
If any of these four layers is missing, you can "write code," but you can't build a stable market.
Most easily missed boundaries in market design
1. When listing, is the item actually locked?
If only Listing is recorded, but the item itself isn’t securely held:
- Seller might have already moved the item
- Frontend still shows “available for purchase”
- Buyer pays and finds out delivery is impossible
2. Are payment and delivery in the same atomic transaction?
If payment succeeds but delivery fails, or delivery succeeds but payment fails, both cause serious experience and asset issues. One core value of on-chain markets is putting these two actions into the same atomic transaction.
3. Are delist, expiration, re-listing paths closed?
Many markets don’t have problems with listing and purchasing, but rather:
- Expired entries still in list
- Inventory doesn’t return after delisting
- Re-listing causes state confusion
14.4 Dynamic Pricing Strategies
Strategy One: Fixed Price
Simplest pricing, Owner sets price, players buy at price (like the market example above).
Strategy Two: Dutch Auction (Decreasing Price)
public fun get_current_price(
start_price: u64,
end_price: u64,
start_time_ms: u64,
duration_ms: u64,
clock: &Clock,
): u64 {
let elapsed = clock.timestamp_ms() - start_time_ms;
if (elapsed >= duration_ms) {
return end_price // Reached minimum price
}
// Linear decrease
let price_drop = (start_price - end_price) * elapsed / duration_ms;
start_price - price_drop
}
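The same linear decay is worth sanity-checking off-chain before you publish an auction. A TypeScript mirror of get_current_price, using the same floor division:

```typescript
// Current Dutch-auction price: linear decay from startPrice down to
// endPrice over durationMs, clamped at endPrice afterwards.
function dutchPrice(
  startPrice: number,
  endPrice: number,
  startTimeMs: number,
  durationMs: number,
  nowMs: number,
): number {
  const elapsed = nowMs - startTimeMs;
  if (elapsed >= durationMs) return endPrice; // reached the floor
  const drop = Math.floor(((startPrice - endPrice) * elapsed) / durationMs);
  return startPrice - drop;
}
```

A frontend can call this locally every second to animate the falling price, rather than re-reading the chain on each tick.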
Strategy Three: Supply-Demand Dynamic Pricing (AMM Style)
Based on constant product formula x * y = k:
public struct LiquidityPool has key {
id: UID,
reserve_sui: Balance<SUI>,
reserve_item_count: u64,
k_constant: u64, // x * y = k
}
/// Calculate how much SUI is needed to buy n items
public fun get_buy_price(pool: &LiquidityPool, buy_count: u64): u64 {
let new_item_count = pool.reserve_item_count - buy_count;
let new_sui_reserve = pool.k_constant / new_item_count;
new_sui_reserve - balance::value(&pool.reserve_sui)
}
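The constant-product quote is easiest to understand with small numbers. A TypeScript sketch of the same idea; unlike the Move example above, it rounds the new SUI reserve up (a deliberate tweak, so the pool never loses value to rounding):

```typescript
// Constant-product quote: SUI required to take buyCount items out of the
// pool while keeping reserve_sui * reserve_items ≈ k.
function buyPriceSui(reserveSui: number, reserveItems: number, buyCount: number): number {
  const k = reserveSui * reserveItems;
  const newItems = reserveItems - buyCount;
  if (newItems <= 0) throw new Error("pool would be emptied");
  // Round up in the pool's favor so rounding dust never drains it.
  const newSuiReserve = Math.ceil(k / newItems);
  return newSuiReserve - reserveSui;
}
```

Running it shows the slippage the chapter warns about: buying 10 of 100 items from a 1,000-SUI pool costs 112 SUI, but buying 50 costs 1,000 SUI — far more than 5× the price.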
Strategy Four: Member Discounts
public fun calculate_price(
base_price: u64,
buyer: address,
member_registry: &Table<address, MemberTier>,
): u64 {
if (table::contains(member_registry, buyer)) {
let tier = table::borrow(member_registry, buyer);
match (tier) {
MemberTier::Gold => base_price * 80 / 100, // 20% off
MemberTier::Silver => base_price * 90 / 100, // 10% off
_ => base_price,
}
} else {
base_price
}
}
Choosing pricing strategy is essentially balancing three things:
- Revenue maximization
- User predictability
- Anti-manipulation capability
Why will fixed pricing never go out of style?
Because it’s easiest to understand and easiest to operate.
Suited for:
- Low-frequency goods
- Services with stable price expectations
- Newly launched products that haven't yet mapped the real demand curve
Many Builders want to implement complex pricing from the start, but actually the more stable path is usually:
- First use fixed pricing to establish real demand
- Then decide based on data whether to introduce dynamic mechanisms
What is Dutch auction suited for?
It’s suited for:
- Scarce resource initial sale
- You’re uncertain of market psychological price point
- Want price to automatically fall back over time
But you must accept one reality:
- It’s better suited for “single sale”
- Not necessarily suited for long-term stable operating stores
Why is AMM style both dangerous and powerful?
Powerful because:
- Continuously tradable
- Doesn’t depend on manual individual listings
- Price can automatically respond to inventory changes
Dangerous because:
- Player trades are amplified by slippage and the shape of the curve
- Easy to arbitrage while parameters are still being tuned
- With insufficient pool depth, quoted prices look terrible
So if you’re not building a system that truly needs “continuous liquidity curves,” you don’t necessarily need AMM.
14.5 Vault Management Patterns
Every commercial facility should have a vault to manage revenue:
module my_finance::vault;
use sui::balance::{Self, Balance};
use sui::coin::{Self, Coin};
use sui::sui::SUI;
/// Multi-asset vault
public struct MultiVault has key {
id: UID,
sui_balance: Balance<SUI>,
total_deposited: u64, // Historical total deposited
total_withdrawn: u64, // Historical total withdrawn
}
/// Deposit funds
public fun deposit(vault: &mut MultiVault, coin: Coin<SUI>) {
let amount = coin::value(&coin);
vault.total_deposited = vault.total_deposited + amount;
balance::join(&mut vault.sui_balance, coin::into_balance(coin));
}
/// Distribute proportionally to multiple addresses
public fun distribute(
vault: &mut MultiVault,
recipients: vector<address>,
shares: vector<u64>, // Shares (percentage, total must equal 100)
ctx: &mut TxContext,
) {
assert!(vector::length(&recipients) == vector::length(&shares), EMismatch);
let total = balance::value(&vault.sui_balance);
let len = vector::length(&recipients);
let mut i = 0;
while (i < len) {
let share = *vector::borrow(&shares, i);
let payout = total * share / 100;
let coin = coin::take(&mut vault.sui_balance, payout, ctx);
transfer::public_transfer(coin, *vector::borrow(&recipients, i));
vault.total_withdrawn = vault.total_withdrawn + payout;
i = i + 1;
};
}
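The distribute loop pays total * share / 100 per recipient with integer division, which means rounding dust silently stays in the vault. A TypeScript sketch that makes this remainder explicit:

```typescript
// Proportional payout with integer division, mirroring distribute().
// Whatever rounding leaves unpaid remains in the vault as dust.
function distribute(total: number, shares: number[]): { payouts: number[]; dust: number } {
  if (shares.reduce((a, b) => a + b, 0) !== 100) throw new Error("shares must sum to 100");
  const payouts = shares.map((s) => Math.floor((total * s) / 100));
  const paid = payouts.reduce((a, b) => a + b, 0);
  return { payouts, dust: total - paid };
}
```

On 1,001 MIST with a 50/30/20 split, each recipient is rounded down and 1 MIST remains in the vault. Deciding up front where that dust goes (vault, last recipient, burn) avoids exactly the "remainders and rounding errors" dispute listed above.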
The hard part of vault design has never been "putting money in," but rather:
How revenue settles, who can move it, when it is distributed, and whether it can still be audited afterwards.
A stable vault must answer at least these questions
- What asset is revenue denominated in?
- Are funds distributed in real-time or first settled then allocated?
- Who can withdraw? Who can pause? Who can change revenue share ratios?
- How to handle remainders and rounding errors during distribution?
- In case of disputes, can on-chain records be traced?
Trade-offs between “immediate distribution” and “vault first then settlement”
Immediate distribution
Pros:
- Logic intuitive
- Revenue immediately to all parties
Cons:
- Each transaction heavier
- More revenue paths, larger failure surface
Vault first then settlement
Pros:
- Main transaction lighter
- Revenue sharing, withdrawal, auditing easier to separate
Cons:
- Need to additionally handle withdrawal permissions and settlement timing
In most real products, the latter will be more stable.
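The "vault first, settle later" pattern can be sketched as a tiny accrual ledger (names hypothetical; on-chain this would be a Table of owed balances inside the vault object):

```typescript
// Minimal accrue-then-settle ledger: revenue accumulates per beneficiary
// on the hot path, and is paid out in a separate, lighter settlement step.
class SettlementLedger {
  private owed = new Map<string, number>();

  // Called on every sale — cheap, no transfers.
  accrue(beneficiary: string, amount: number): void {
    this.owed.set(beneficiary, (this.owed.get(beneficiary) ?? 0) + amount);
  }

  // Called rarely — zeroes the balance and returns the amount to pay out.
  settle(beneficiary: string): number {
    const amount = this.owed.get(beneficiary) ?? 0;
    this.owed.delete(beneficiary);
    return amount;
  }
}
```

The key property: the frequent operation (accrue) touches one counter, while the failure-prone operation (actually moving funds) happens on its own schedule with its own permissions.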
14.6 Economic System Design Principles
| Principle | Practical advice |
|---|---|
| Sustainability | Design buyback mechanisms (such as using revenue to buyback and burn tokens) to avoid inflation |
| Transparency | All economic parameters queryable on-chain, record every transaction through events |
| Anti-manipulation | Avoid single-point price control, introduce AMM or Dutch auction |
| Incentive alignment | Make service providers (Builders) and users’ interests aligned |
| Upgrade retention | Key parameters (fee rates, prices) designed to be updatable, avoid contract lock-in |
Three more underestimated principles
| Principle | Why important |
|---|---|
| Anti-volume manipulation | As long as your system has fee rebates, activity incentives, or leaderboards, someone will manipulate volume |
| Exit path | Whether players can unsubscribe, retrieve deposits, delist assets determines system trustworthiness |
| Parameter interpretability | When players can’t understand where prices and fees come from, they naturally distrust your protocol |
Attack surfaces you must proactively ask about during design
- Can players trade with themselves to farm rewards?
- Can whales instantly drain liquidity or manipulate prices?
- Can discounts and rebates be arbitraged in loops?
- During vault revenue distribution, can it be front-run or claimed multiple times?
If you write these questions out during the design phase, many vulnerabilities won’t make it into code.
Chapter Summary
| Knowledge Point | Core Points |
|---|---|
| Custom token | ALLIANCE_TOKEN one-time witness + coin::create_currency() |
| Decentralized market | SSU extension + Listing Table + fee mechanism |
| Pricing strategies | Fixed price / Dutch auction / AMM constant product / member discounts |
| Vault management | Balance<T> as internal ledger, proportional distribution |
| Economic design principles | Sustainable + transparent + anti-manipulation + upgradeable |
Further Reading
- Sui Coin Standard
- Move Book: Coin Module
- EVE Frontier Economic Design
- Chapter 12: Table & Dynamic Fields
Chapter 15: Cross-contract Composability
Goal: Master how to design externally friendly contract interfaces, and how to safely call contracts published by other Builders, building a composable EVE Frontier ecosystem.
Status: Advanced design chapter. Main content focuses on cross-contract interfaces and composability.
15.1 The Value of Composability
One of EVE Frontier’s most exciting features: Your contract can directly call others’ contracts, without any intermediary.
Builder A: Issued ALLY Token + price oracle
Builder B: Calls A's price oracle, prices items in ALLY Token for sale
Builder C: Lists on B's market, accepts both A's ALLY and SUI payment
This creates a truly open economic protocol stack.
The real power of composability isn’t the slogan “everyone can call each other,” but rather:
Once your protocol is clear enough, others can treat it as building blocks, not as a black box.
This directly changes Builder thinking:
- You’re no longer just building a single-point function
- You’re deciding whether to become a “terminal product” or “underlying capability”
Many of the most valuable protocols don’t do everything themselves, but make a certain capability into a module others are willing to repeatedly integrate.
15.2 Designing Externally Friendly Move Interfaces
Good Move interface design should follow:
module my_protocol::oracle;
// ── Public view functions (read-only, free to call) ──────────────────────
/// Get ALLY/SUI exchange rate (in MIST)
public fun get_ally_price(oracle: &PriceOracle): u64 {
oracle.ally_per_sui
}
/// Check if price is within validity period
public fun is_price_fresh(oracle: &PriceOracle, clock: &Clock): bool {
clock.timestamp_ms() - oracle.last_updated_ms < PRICE_TTL_MS
}
// ── Public composable functions (callable by other contracts) ───────────────────
/// Convert SUI amount to ALLY quantity
public fun sui_to_ally_amount(
oracle: &PriceOracle,
sui_amount: u64,
clock: &Clock,
): u64 {
assert!(is_price_fresh(oracle, clock), EPriceStale);
sui_amount * oracle.ally_per_sui / 1_000_000_000
}
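The conversion and freshness check translate directly to the client side, with the same MIST scaling (1 SUI = 10^9 MIST). The TTL value is assumed here, matching the role of the Move PRICE_TTL_MS constant:

```typescript
const PRICE_TTL_MS = 60_000; // assumed TTL; mirrors the contract's constant

// Reject stale quotes, then convert a SUI amount (in MIST) to ALLY using
// an ally-per-sui rate, mirroring sui_to_ally_amount.
function suiToAlly(
  suiAmountMist: number,
  allyPerSui: number,
  lastUpdatedMs: number,
  nowMs: number,
): number {
  if (nowMs - lastUpdatedMs >= PRICE_TTL_MS) throw new Error("price is stale");
  return Math.floor((suiAmountMist * allyPerSui) / 1_000_000_000);
}
```

Duplicating the staleness check in the dApp means users see "price expired, refresh" before submitting, instead of paying gas just to hit the on-chain EPriceStale abort.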
Design Principles
| Principle | Implementation |
|---|---|
| Read-only views | public fun without &mut, zero Gas calls |
| Composable operations | Accept Witness parameters, allow authorized callers to execute |
| Versioning | Preserve old interfaces, distinguish new interfaces with new function names/type parameters |
| Event emission | Emit events for key operations, convenient for monitoring |
| Documentation | Complete comments explaining preconditions and return values |
The bar for a good interface isn't just "others can call it"
A truly externally friendly interface should at least allow external integrators to quickly answer these questions:
- Will this function modify state?
- What objects and permissions must be prepared before calling?
- What are the most common reasons for call failure?
- What do return values and events each represent?
If these aren’t clear, others can “theoretically call” but integration costs will be absurdly high.
Three most common mistakes in interface design
1. Directly exposing internal implementation details as external dependencies
Once your interface heavily depends on internal object layout, every refactoring will drag external integrators down with you.
2. Read and write interfaces mixed too closely
Read-only queries should be as simple and stable as possible, while writable entry points should clearly mark their permissions and side effects. When the two are tangled together, integrators are prone to misusing them.
3. Error boundaries unclear
If a function might fail due to:
- Insufficient permissions
- Stale data
- Invalid price
- Object state mismatch
Then these preconditions should ideally be exposed through documentation, naming, or auxiliary read-only interfaces.
15.3 Calling Other Builders’ Contracts
Adding External Dependencies in Move.toml
[dependencies]
# Depend on a package already published by another Builder (via Git).
# TOML inline tables must stay on a single line:
AllyOracle = { git = "https://github.com/builder-alice/ally-oracle", subdir = "contracts", rev = "v1.0.0" }
# Or point at a local checkout for testing
AllyOracleOnChain = { local = "../ally-oracle" } # For local testing
Calling in Move Code
module my_market::ally_market;
// Import other Builder's modules (need to declare dependency in Move.toml)
use ally_oracle::oracle::{Self, PriceOracle};
use ally_dao::ally_token::ALLY_TOKEN;
public fun buy_with_ally(
storage_unit: &mut world::storage_unit::StorageUnit,
character: &Character,
price_oracle: &PriceOracle, // External Builder A's price oracle
ally_payment: Coin<ALLY_TOKEN>, // External Builder A's token
item_type_id: u64,
clock: &Clock,
ctx: &mut TxContext,
): Item {
// Call external contract's view function
let price_in_sui = oracle::sui_to_ally_amount(
price_oracle,
ITEM_BASE_PRICE_SUI,
clock,
);
assert!(coin::value(&ally_payment) >= price_in_sui, EInsufficientPayment);
// Process ALLY Token payment (transfer to alliance vault, etc.)
// ...
// Withdraw item from own SSU
storage_unit::withdraw_item(
storage_unit, character, MyMarketAuth {}, item_type_id, ctx,
)
}
When depending on others’ contracts, what are you really binding to?
Not as simple as “a Git repository address,” but simultaneously binding to:
- Their interface stability
- Their upgrade strategy
- Their economic and governance choices
- Your own failure radius
In other words, every external protocol you introduce equals outsourcing part of your stability to others.
So ask four questions before integrating external protocols
- Are this protocol’s core interfaces stable?
- Will it break my current usage when upgrading?
- If it pauses or fails, do I have a fallback path?
- Can I converge key dependencies to read-only interfaces, rather than deep write coupling?
15.4 Interface Versioning & Protocol Standards
When your contract is widely used, upgrading interfaces must ensure backward compatibility:
module my_protocol::market_v2;
// Use types to mark versions
public struct V1 has drop {}
public struct V2 has drop {}
// V1 interface (always preserved)
public fun get_price_v1(market: &Market, _: V1): u64 {
market.price
}
// V2 interface (new, supports dynamic pricing)
public fun get_price_v2(
market: &Market,
clock: &Clock,
_: V2,
): u64 {
calculate_dynamic_price(market, clock)
}
Defining Cross-contract Interface Standards (Similar to ERC Standards)
In the EVE Frontier ecosystem, interface standards can be agreed upon through documentation, allowing multiple Builders’ contracts to be compatible:
// ── Unofficial "Market Interface" Standard Proposal ────────────────────────────
// Any Builder's contract wanting to integrate into aggregated markets should implement the following interfaces:
/// List items: return currently selling item types and prices
/// (Move cannot store tuples in a vector, so a real standard would define
/// a small struct, e.g. ItemQuote { type_id: u64, price_sui: u64 })
public fun list_items(market: &T): vector<ItemQuote>
/// Query whether specific item is available for purchase
public fun is_available(market: &T, item_type_id: u64): bool
/// Purchase (return item)
public fun purchase<Auth: drop>(
market: &mut T,
buyer: &Character,
item_type_id: u64,
payment: &mut Coin<SUI>,
auth: Auth,
ctx: &mut TxContext,
): Item
Why must version control be considered from the first version?
Because once others start depending on you, “changing interfaces” is no longer just your internal matter.
You must simultaneously consider:
- Whether old callers can still survive
- Whether new features can be gradually introduced
- Whether frontends, scripts, aggregators need to migrate synchronously
Many protocols don’t die from insufficient features, but from “version two breaking everything in version one.”
Where standardized interfaces are most valuable
Not looking professional, but catalyzing secondary ecosystems:
- Aggregators easier to integrate
- Price comparison tools easier to build
- Third-party frontends easier to reuse
- Other Builders more willing to build on top of you
15.5 Practice: Aggregated Price Comparator
// Aggregate multiple Builders' market prices in dApp
async function getAggregatedPrices(
itemTypeId: number,
marketIds: string[],
client: SuiClient,
): Promise<Array<{ marketId: string; price: number; builder: string }>> {
// Batch read all market states
const markets = await client.multiGetObjects({
ids: marketIds,
options: { showContent: true },
});
const prices = markets
.map((market, i) => {
const fields = (market.data?.content as any)?.fields;
if (!fields) return null;
// Read price from listings Table (simplified)
const listing = fields.listings?.fields?.contents?.find(
(entry: any) => Number(entry.fields?.key) === itemTypeId
);
if (!listing) return null;
return {
marketId: marketIds[i],
price: Number(listing.fields.value.fields.price),
builder: fields.owner ?? "Unknown",
};
})
.filter(Boolean)
.sort((a, b) => a!.price - b!.price); // Sort by price ascending
return prices as any[];
}
This example is well-suited to illustrate one reality:
The value of composability is often amplified off-chain.
In other words, as long as on-chain protocols design interfaces and events clearly, off-chain can create:
- Price comparators
- Aggregators
- Recommendation routing
- Strategy orchestration
So when designing contracts, don’t just think about “whether another on-chain contract will call me,” also think “whether off-chain tools will be willing to consume me.”
15.6 Composability Risks & Defense
| Risk | Description | Defense |
|---|---|---|
| Dependent contract upgrade | External contract upgrade may break your calls | Lock to a specific version (rev = "v1.0.0") |
| External contract pause | Dependent contract revoked or modified | Design fallback paths (fallback logic) |
| Reentrancy attacks | External contract callbacks to your contract | Move naturally defends through ownership system |
| Price manipulation | Dependent oracle manipulated | Use multiple oracles, take median |
Three more very common risks in real projects
| Risk | Description | Defense |
|---|---|---|
| Interface semantic drift | Function name unchanged, but behavior calibration changed | Constrain together with version numbers, documentation, and event semantics |
| External protocol alive, but data quality declines | Oracle not broken, just updates slower or price abnormal | Add freshness / sanity checks |
| Missing fallback path | When external dependency unavailable, own main process directly paralyzed | Preset fallback, pause switches, manual takeover paths |
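The freshness / sanity checks in the table above can be sketched like this. The `PriceReading` shape, the thresholds, and the median-of-usable-readings strategy are illustrative assumptions, not a fixed API:

```typescript
// Hypothetical oracle reading shape; field names are assumptions for illustration.
interface PriceReading {
  price: number;        // quoted price
  updatedAtMs: number;  // when the oracle last refreshed this value
}

// Reject readings that are stale or wildly off from a reference price.
function isUsableReading(
  reading: PriceReading,
  nowMs: number,
  referencePrice: number,
  maxAgeMs = 60_000,        // freshness bound (assumed)
  maxDeviationBps = 2_000,  // sanity bound: ±20% (assumed)
): boolean {
  if (nowMs - reading.updatedAtMs > maxAgeMs) return false; // stale
  const deviationBps =
    (Math.abs(reading.price - referencePrice) * 10_000) / referencePrice;
  return deviationBps <= maxDeviationBps;
}

// Take the median of the usable readings, matching the "multiple oracles,
// take median" defense from the table above.
function medianUsablePrice(
  readings: PriceReading[],
  nowMs: number,
  referencePrice: number,
): number | null {
  const prices = readings
    .filter((r) => isUsableReading(r, nowMs, referencePrice))
    .map((r) => r.price)
    .sort((a, b) => a - b);
  if (prices.length === 0) return null; // caller must take a fallback path
  const mid = Math.floor(prices.length / 2);
  return prices.length % 2 === 1
    ? prices[mid]
    : (prices[mid - 1] + prices[mid]) / 2;
}
```

Note the `null` return: that is exactly the "external protocol alive, but data quality declines" scenario, and it forces the caller to have a fallback path rather than silently using a bad price.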
Composition isn’t better the deeper it goes
The deeper the composition layers go, the more capability you gain, but the harder the system becomes to maintain.
A practical principle is:
- Prioritize depending on stable, read-only, verifiable external capabilities
- Cautiously depend on deeply coupled, strong state-writing external processes
When the former breaks, the result is usually just degraded data; when the latter breaks, it can sever your core business chain.
Chapter Summary
| Knowledge Point | Core Points |
|---|---|
| Composability value | Your contract can be called by others, forming protocol stack |
| Interface design | Read-only views + Witness authorization + documentation comments |
| Reference external packages | Move.toml dependencies + use statements |
| Version control | Preserve old interfaces + type-marked versions |
| Aggregated dApp | Batch read multi-contract data, frontend aggregated display |
Further Reading
Chapter 16: Location and Proximity Systems
Goal: Understand EVE Frontier’s on-chain location privacy design, master how to build location-based game logic using the proximity system, and explore future ZK proof directions.
Status: Educational chapter. Main focus on location privacy, server proofs, and future ZK directions.
16.1 On-Chain Challenges for Spatial Games
In a traditional MMORPG, location information is managed centrally by game servers. On-chain, this creates two contradictions:
- Transparency: On-chain data is publicly viewable; if coordinates are stored in plaintext, all players’ hidden base locations are immediately exposed
- Trustworthiness: If locations are reported by clients, players can forge them (“I’m right next to you!”)
EVE Frontier’s solution: Hashed locations + trusted game server signatures.
What’s most important here isn’t memorizing the phrase “hashed location,” but first understanding what it’s balancing:
- Privacy: cannot expose base, facility, or player locations directly
- Verifiability: must allow certain distance-related actions to be proven
- Usability: cannot be so slow that the game becomes unplayable
So the location system is essentially an engineering trade-off between “privacy, trust, and real-time performance.”
16.2 Hashed Locations: Protecting Coordinate Privacy
What’s stored on-chain isn’t plaintext coordinates, but hash values:
Storage: hash(x, y, salt) → chain.location_hash
Query: Anyone can only see the hash, cannot reverse-engineer coordinates
Verification: Players prove to the server "I know the coordinates for this hash"
// location.move (simplified)
public struct Location has store {
location_hash: vector<u8>, // Hash of coordinates, not plaintext
}
/// Update location (requires game server signature authorization)
public fun update_location(
assembly: &mut Assembly,
new_location_hash: vector<u8>,
admin_acl: &AdminACL, // Must be authorized server as sponsor
ctx: &TxContext,
) {
verify_sponsor(admin_acl, ctx);
assembly.location.location_hash = new_location_hash;
}
What Can Hashed Locations Protect, and What Can’t They?
They can protect:
- Plaintext coordinates aren’t exposed on-chain
- Normal observers cannot directly see real locations from object fields
They cannot automatically protect against:
- Reverse-engineering risks from weak hashes or enumerable spaces
- Off-chain interfaces leaking real locations
- Frontend or logs accidentally exposing mapping relationships
In other words, hashing is one layer of the privacy system, not the whole thing.
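The "enumerable spaces" risk above is easy to demonstrate: with a known salt and a small coordinate grid, the hash is trivially brute-forced. The hash construction below (`sha256` over `x:y:salt`) is an assumption for illustration, not EVE Frontier's actual scheme:

```typescript
import { createHash } from "node:crypto";

// Hypothetical hash construction; the game's real scheme may differ.
function locationHash(x: number, y: number, salt: string): string {
  return createHash("sha256").update(`${x}:${y}:${salt}`).digest("hex");
}

// If the salt is known (or absent), a small coordinate space is enumerable:
// an attacker just tries every (x, y) until the hash matches.
function bruteForce(
  targetHash: string,
  salt: string,
  maxCoord: number,
): { x: number; y: number } | null {
  for (let x = 0; x <= maxCoord; x++) {
    for (let y = 0; y <= maxCoord; y++) {
      if (locationHash(x, y, salt) === targetHash) return { x, y };
    }
  }
  return null;
}
```

On a 100×100 grid with a public salt, recovery is instant; it is the high-entropy, per-object secret salt (plus a large coordinate space) that makes the search infeasible.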
16.3 Proximity Verification: Server Signature Pattern
When verifying “A is near B” (e.g., picking up items, jumping), the current approach uses server signatures:
① Player requests game server: "Prove I'm near stargate 0x..."
② Server queries player's actual game coordinates
③ Server verifies player is indeed near the stargate (<20km)
④ Server signs a statement "Player A is near Stargate B" with private key
⑤ Player attaches this signature to the transaction
⑥ On-chain contract verifies signature is from authorized server (AdminACL)
The most critical trust boundary in this design is:
The chain doesn’t know the real coordinates; it only trusts “the authorized server has already judged this for it.”
This means system security depends not only on strict on-chain validation, but also on:
- Whether the game server is honest
- Whether the signature payload is complete
- Whether time windows and nonces are designed correctly
// Distance verification when linking stargates
public fun link_gates(
gate_a: &mut Gate,
gate_b: &mut Gate,
owner_cap_a: &OwnerCap<Gate>,
distance_proof: vector<u8>, // Server-signed proof of "distance > 20km between gates"
admin_acl: &AdminACL,
ctx: &TxContext,
) {
// Verify server signature (simplified; actual implementation verifies ed25519 signature)
verify_sponsor(admin_acl, ctx);
// ...
}
16.3.1 Recommended Minimal Proof Message Body
Don’t make “proximity proof” a black box byte string that only the server understands. The minimal viable payload should bind at least these fields:
{
"proof_type": "assembly_proximity",
"player": "0xPLAYER",
"assembly_id": "0xASSEMBLY",
"location_hash": "0xHASH",
"max_distance_m": 20000,
"issued_at_ms": 1735689600000,
"expires_at_ms": 1735689660000,
"nonce": "4d2f1c..."
}
Each field’s responsibility:
- player: prevents other players from reusing the proof
- assembly_id: prevents a proof issued for stargate A from being used to call stargate B
- location_hash: binds the current on-chain location state into the proof
- issued_at_ms / expires_at_ms: limits the replay window
- nonce: prevents multiple replays within the same window
16.3.2 Minimal Loop Between Server-Side Signing and On-Chain Validation
Off-chain services must do at least two things: first verify real coordinate relationships, then sign an explicit payload.
type ProximityProofPayload = {
proofType: "assembly_proximity";
player: string;
assemblyId: string;
locationHash: string;
maxDistanceM: number;
issuedAtMs: number;
expiresAtMs: number;
nonce: string;
};
async function issueProximityProof(input: {
player: string;
assemblyId: string;
expectedHash: string;
}) {
const location = await getPlayerLocationFromGameServer(input.player);
const assembly = await getAssemblyLocation(input.assemblyId);
assert(hash(location) === input.expectedHash);
assert(distance(location, assembly) <= 20_000);
const payload: ProximityProofPayload = {
proofType: "assembly_proximity",
player: input.player,
assemblyId: input.assemblyId,
locationHash: input.expectedHash,
maxDistanceM: 20_000,
issuedAtMs: Date.now(),
expiresAtMs: Date.now() + 60_000,
nonce: crypto.randomUUID(),
};
return signPayload(payload);
}
On-chain side must validate at least four layers:
// Simplified pseudocode: real implementation should deserialize payload and compare field by field
public fun verify_proximity_proof(
assembly_id: ID,
expected_player: address,
expected_hash: vector<u8>,
proof_bytes: vector<u8>,
admin_acl: &AdminACL,
clock: &Clock,
ctx: &TxContext,
) {
verify_sponsor(admin_acl, ctx);
let payload = decode_proximity_payload(proof_bytes);
assert!(payload.assembly_id == assembly_id, EWrongAssembly);
assert!(payload.player == expected_player, EWrongPlayer);
assert!(payload.location_hash == expected_hash, EWrongLocationHash);
assert!(clock.timestamp_ms() <= payload.expires_at_ms, EProofExpired);
assert!(check_and_consume_nonce(payload.nonce), EReplay);
}
What’s truly important here is: verify_sponsor(admin_acl, ctx) only proves “this transaction came from an authorized server,” which isn’t enough to prove “this location statement itself is for the current object, current player, current time window.”
So What’s the Most Common Mistake in Location Proofs?
Not “getting the signature algorithm wrong,” but incomplete payload binding.
Once the payload misses binding one item, classic reuse problems emerge:
- Bound player but not object: a player can use a proof issued for A to call B
- Bound object but not time window: old proofs can be replayed repeatedly
- Bound time but not the current location hash: an old location can impersonate the new one
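These binding mistakes can be made concrete with a small off-chain analog of the on-chain checks. The `ProofPayload` shape and `NonceRegistry` below are hypothetical stand-ins for the Move-side payload comparison and `check_and_consume_nonce`:

```typescript
// In-memory nonce registry; on-chain this would be a Table keyed by nonce.
class NonceRegistry {
  private consumed = new Set<string>();

  // Returns true the first time a nonce is seen, false on any replay.
  checkAndConsume(nonce: string): boolean {
    if (this.consumed.has(nonce)) return false; // replay detected
    this.consumed.add(nonce);
    return true;
  }
}

// Minimal payload for this sketch; field names mirror the earlier example.
interface ProofPayload {
  player: string;
  assemblyId: string;
  expiresAtMs: number;
  nonce: string;
}

// Mirrors the on-chain layers: person, object, time window, replay.
function verifyPayload(
  payload: ProofPayload,
  expectedPlayer: string,
  expectedAssembly: string,
  nowMs: number,
  registry: NonceRegistry,
): boolean {
  if (payload.player !== expectedPlayer) return false;       // wrong-person reuse
  if (payload.assemblyId !== expectedAssembly) return false; // wrong-object reuse
  if (nowMs > payload.expiresAtMs) return false;             // expired window
  return registry.checkAndConsume(payload.nonce);            // replay
}
```

Dropping any one of these four checks reproduces exactly one row of the reuse problems listed above.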
16.4 Strategic Design Around Location Systems
Even though locations are hashed, Builders can still design many location-based mechanics:
Strategy One: Location Locking (Asset Bound to Location)
// Asset is only valid at specific location hash
public fun claim_resource(
claim: &mut ResourceClaim,
claimant_location_hash: vector<u8>, // Server-proven location
admin_acl: &AdminACL,
ctx: &mut TxContext,
) {
verify_sponsor(admin_acl, ctx);
// Verify player location hash matches resource point
assert!(
claimant_location_hash == claim.required_location_hash,
EWrongLocation,
);
// Grant resource
}
What’s truly interesting about location systems is: you don’t need to know plaintext coordinates to design very strong spatial rules.
This means Builders at the upper business layer usually don’t care about “exactly where you are in the universe,” but rather:
- Whether you’re near a certain facility
- Whether you’re within a certain region
- Whether you meet entry, extraction, activation conditions
This makes many mechanics feel more like “conditional access control” rather than “map rendering systems.”
Strategy Two: Base Zone Control
public struct BaseZone has key {
id: UID,
center_hash: vector<u8>, // Base center location hash
owner: address,
zone_nft_ids: vector<ID>, // List of friendly NFTs in this zone
}
// Authorize component only for players within base range
public fun base_service(
zone: &BaseZone,
service: &mut StorageUnit,
player_in_zone_proof: vector<u8>, // Server proof "player is within base range"
admin_acl: &AdminACL,
ctx: &mut TxContext,
) {
verify_sponsor(admin_acl, ctx);
// ...provide service
}
Strategy Three: Movement Path Tracking (Off-chain + On-chain Combined)
// Off-chain: Listen to player location update events
client.subscribeEvent({
filter: { MoveEventType: `${WORLD_PKG}::location::LocationUpdated` },
onMessage: (event) => {
const { assembly_id, new_hash } = event.parsedJson as any;
// Update local path records
locationHistory.push({ assembly_id, hash: new_hash, time: Date.now() });
},
});
// On-chain: Only store hash, parse path off-chain
16.5 Future Direction: Zero-Knowledge Proofs Replacing Server Trust
Official documentation mentions future plans to use ZK proofs to replace current server signatures:
Now:
Player → Server (where are you?) → Server signature → On-chain signature verification
Future (ZK):
Player → Local computation of ZK proof ("I know coordinates satisfying this hash, and < 20km")
→ On-chain ZK verifier (no server involvement)
Advantages of ZK Proofs:
- Fully decentralized, doesn’t depend on server honesty
- Players can prove “I’m here” without exposing exact coordinates
- Can theoretically prove arbitrarily complex spatial relationships
Practical Development Recommendations:
- During the current phase, when integrating with servers, design the payload structure, time windows, nonce, and object binding clearly (see Chapter 8)
- AdminACL.verify_sponsor() can only serve as one layer of “source verification”; it cannot replace payload validation
- When ZK goes live in the future, aim to replace only the proof mechanism; don’t rewrite the upper business state machines
Why Design Now for “Future Proof Mechanism Replaceability”?
Because what should really be stable is upper business semantics, not today’s proof implementation details.
In other words, you should split the system into two layers:
- Upper business rules, e.g. “items can only be withdrawn when nearby”
- Lower proof mechanism, e.g. server signatures today, possibly ZK in the future
This way when upgrading in the future, you’re replacing “how to prove,” not rewriting the entire business state machine.
16.5.1 Failure Scenarios and Defense Checklist
| Failure Scenario | Typical Cause | Minimal Defense |
|---|---|---|
| Proof replay | Payload lacks nonce or expiry time | Add nonce + short validity + on-chain consumption |
| Wrong object reuse | Proof doesn’t bind assembly_id | Payload strongly binds target object |
| Wrong person reuse | Proof doesn’t bind player | Payload strongly binds caller address |
| Old location reuse | Doesn’t bind location_hash | Write current on-chain hash into payload |
| Server clock drift | Expiry judgment inconsistent | Use on-chain Clock for final judgment |
Another Commonly Overlooked Failure Scenario: Off-Chain Cache Staleness
If the server gets an old location cache, it might sign a “formally legal, business-wise incorrect” proof.
So in real systems, you also need to consider:
- Whether server location data source is fresh enough
- Whether location sampling and on-chain state have significant delays
- Whether certain actions need shorter proof validity periods
16.6 Displaying Location Information in dApps
// Location info isn't directly readable by Builders (only the hash is on-chain),
// but human-readable place names can be shown by querying the game server API
interface AssemblyDisplayInfo {
id: string
name: string
systemName: string // Star system name (from server API)
constellation: string // Constellation
region: string // Region
onlineStatus: string
}
async function getAssemblyDisplayInfo(assemblyId: string): Promise<AssemblyDisplayInfo> {
// 1. Read hashed location from chain
const obj = await suiClient.getObject({
id: assemblyId,
options: { showContent: true },
});
const locationHash = (obj.data?.content as any)?.fields?.location?.fields?.location_hash;
// 2. Query star system name via game server API using hash
const geoRes = await fetch(`${GAME_API}/location?hash=${locationHash}`);
const geoInfo = await geoRes.json();
return {
id: assemblyId,
name: (obj.data?.content as any)?.fields?.name,
systemName: geoInfo.system_name,
constellation: geoInfo.constellation,
region: geoInfo.region,
onlineStatus: (obj.data?.content as any)?.fields?.status,
};
}
When Displaying Locations in the Frontend, What Matters Most Isn’t “How Detailed,” But “Not Leaking Unauthorized Information”
So frontends are usually better suited to display:
- Star system name
- Constellation
- Region
- Online status
Rather than carelessly displaying:
- Overly granular internal coordinates
- Debug fields that can be used to reverse-engineer precise locations
This is why location systems must be designed together with the off-chain display layer, not just considering hashing in contracts and calling it done.
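One simple way to enforce “display only what’s authorized” is an explicit field whitelist, so debug or coordinate fields can never leak by accident. This sketch reuses the field names from `AssemblyDisplayInfo` above; the raw-object shape is an assumption:

```typescript
// Raw object as it might come back from an indexer; extra fields are hypothetical.
type RawAssembly = Record<string, unknown>;

// Explicit whitelist: only these fields ever reach the UI.
const DISPLAY_FIELDS = [
  "id",
  "name",
  "systemName",
  "constellation",
  "region",
  "onlineStatus",
] as const;

// Copy only whitelisted fields; everything else is dropped by construction.
function toDisplayInfo(raw: RawAssembly): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const key of DISPLAY_FIELDS) {
    if (key in raw) out[key] = raw[key];
  }
  return out;
}
```

The design choice here is deny-by-default: a new debug field added upstream stays invisible until someone deliberately adds it to the whitelist.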
🔖 Chapter Summary
| Knowledge Point | Core Takeaway |
|---|---|
| Hashed Location | Coordinates stored as hashes, prevents privacy leaks |
| Proximity Verification | Current: Server signatures → Future: ZK proofs |
| AdminACL Role | verify_sponsor() verifies server’s sponsor address |
| Builder Opportunities | Location locking, base zones, trajectory analysis |
| ZK Outlook | Fully decentralized spatial proofs without server trust |
📚 Further Reading
Chapter 17: Testing, Debugging, and Security Auditing
Goal: Write comprehensive unit tests for Move contracts, identify common security vulnerabilities, and formulate contract upgrade strategies.
Status: Engineering assurance chapter. Main focus on testing, security, and upgrade risk control.
17.1 Why Is Security Testing Critical?
Once on-chain contracts are deployed, assets are real. Here are common loss scenarios:
- Price calculation overflow, leading to items sold at 0 price
- Missing permission checks, anyone can call “Owner only” functions
- Reentrancy vulnerabilities (less common in Move but still needs attention)
- Upgrade failures cause old data to be unreadable by new contracts
Defense Strategy: Test first, then publish.
The most valuable concept to establish here isn’t the platitude “testing is important,” but:
The goal of on-chain contract testing isn’t to prove it can run, but to prove it won’t lose control under wrong inputs, wrong sequences, and wrong permissions.
Many beginners write tests only verifying “normal path succeeds.” But real asset losses usually come from three other types of paths:
- Calls that shouldn’t succeed but do
- Boundary value inputs that push the system into abnormal states
- After upgrades or maintenance, old objects and new logic are no longer compatible
So for Builders, testing isn’t finishing work, it’s part of design work.
17.2 Move Unit Testing Basics
Move has a built-in testing framework. Test code lives alongside the contract code in the same package (typically in a #[test_only] module or a tests/ directory) and is marked with the #[test] annotation:
module my_package::my_module;
// ... normal contract code ...
// Test module: only compiled in test environment
#[test_only]
module my_package::my_module_tests;
use my_package::my_module;
use sui::test_scenario::{Self, Scenario};
use sui::coin;
use sui::sui::SUI;
use sui::clock;
// ── Basic Test ─────────────────────────────────────────────
#[test]
fun test_deposit_and_withdraw() {
    // Initialize test scenario (simulates blockchain state)
    let mut scenario = test_scenario::begin(@0xA11CE);
    // Test step 1: Alice deploys contract
    {
        let ctx = scenario.ctx();
        my_module::init_for_testing(ctx); // Test-specific init
    };
    // Test step 2: Alice deposits item
    scenario.next_tx(@0xA11CE);
    {
        let mut vault = scenario.take_shared<my_module::Vault>();
        let ctx = scenario.ctx();
        my_module::deposit(&mut vault, 100, ctx);
        assert!(my_module::balance(&vault) == 100, 0);
        test_scenario::return_shared(vault);
    };
    // Test step 3: Bob tries to withdraw (should fail)
    scenario.next_tx(@0xB0B);
    {
        let vault = scenario.take_shared<my_module::Vault>();
        // Such a call should abort; failure paths belong in a separate
        // test marked #[expected_failure] (see below)
        test_scenario::return_shared(vault);
    };
    scenario.end();
}
// ── Test Failure Paths ────────────────────────────────────────
#[test]
#[expected_failure(abort_code = my_module::ENotOwner)]
fun test_unauthorized_withdraw_fails() {
    let mut scenario = test_scenario::begin(@0xA11CE);
    // Deploy
    { my_module::init_for_testing(scenario.ctx()); };
    // Bob tries to operate as the owner (should abort)
    scenario.next_tx(@0xB0B);
    {
        let mut vault = scenario.take_shared<my_module::Vault>();
        my_module::owner_withdraw(&mut vault, scenario.ctx()); // aborts here
        test_scenario::return_shared(vault);
    };
    scenario.end();
}
// ── Test Time-Related Logic with Clock ─────────────────────────
#[test]
fun test_time_based_pricing() {
    let mut scenario = test_scenario::begin(@0xA11CE);
    let mut clock = clock::create_for_testing(scenario.ctx());
    // Set current time to the midpoint of the auction
    clock.set_for_testing(1_000_000);
    {
        let price = my_module::get_dutch_price(
            1000,      // starting price
            100,       // minimum price
            0,         // start time
            2_000_000, // duration (2,000 seconds in ms)
            &clock,
        );
        // At the halfway point, price should be the linear midpoint:
        // 1000 - (1000 - 100) / 2 = 550
        assert!(price == 550, 0);
    };
    clock.destroy_for_testing();
    scenario.end();
}
Running tests:
# Run all tests
sui move test
# Run only specific test
sui move test test_deposit_and_withdraw
# Show verbose output
sui move test --verbose
When Writing Tests, First Categorize into Four Scenarios
A practical test stratification is:
- Normal Path: under valid inputs, does the system complete as expected
- Permission Failure Path: without permission, does it abort reliably
- Boundary Value Path: are 0, max value, expiry, empty collection, and last-entry scenarios correct
- State Evolution Path: after completing one step and then the next, is the system still consistent
If your tests only have the first type, it’s not really enough to be called “tested.”
What is test_scenario Really Suited For?
It’s best suited to simulate:
- Multiple addresses taking turns initiating transactions
- Shared object state changes across multiple transactions
- Behavior changes after time progression
- Complete lifecycle of object creation, retrieval, return
This happens to be exactly where EVE Builder projects’ most common risk concentrations are.
More Fragmented Tests Aren’t Better Tests
Overly fragmented tests end up proving only that small functions work in isolation, without covering real business loops.
A more valuable approach is usually:
- Keep a few key unit tests
- Then write several end-to-end business scenario tests
For example, in a rental system, rather than only testing calc_refund(), more important is testing:
- Create listing
- Successfully rent
- Return early
- Expire and reclaim
Whether this complete chain is closed.
17.3 Common Security Vulnerabilities and Defenses
Vulnerability One: Integer Overflow/Underflow
// ❌ Fragile: u64 subtraction underflow aborts with a generic arithmetic error,
// which is hard to diagnose and hides the real business problem
fun unsafe_calc(a: u64, b: u64): u64 {
    a - b // If b > a, aborts (Move checks arithmetic by default)
}
// ✅ Safe: Check first and abort with a descriptive error code
fun safe_calc(a: u64, b: u64): u64 {
    assert!(a >= b, EInsufficientBalance);
    a - b
}
// ✅ For multiplications that could overflow, bound the inputs first
fun safe_pct(total: u64, bps: u64): u64 {
    // bps max 10000; bounding it also limits how large total * bps can get
    assert!(bps <= 10_000, EInvalidBPS);
    total * bps / 10_000 // u64 max ≈ 1.8e19, still watch very large totals
}
✅ Move’s Advantage: Move checks u64 operation overflow by default, aborts on overflow rather than silently returning wrong values (unlike early Solidity versions).
But note, Move solves “machine-level overflow safety” for you, not “business math correctness.”
For example, these problems the type system won’t think about for you:
- Should fees be calculated then deducted, or deducted then profit-shared
- Should percentages round down or round to nearest
- After multi-address profit sharing, should remainder stay in vault or return to user
Many economic bugs ultimately aren’t “hacker-level vulnerabilities,” but settlement caliber itself designed wrong.
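A toy illustration of how settlement caliber changes outcomes, assuming a hypothetical 2.5% fee in basis points and integer (MIST-style) arithmetic: “fee then split” and “split then fee” disagree by one unit on odd totals, because truncation happens a different number of times:

```typescript
// Integer bps math, as on-chain: amounts are integers, division truncates.
const FEE_BPS = 250n; // 2.5%, an assumed fee for illustration

// Caliber A: deduct the fee first, then split the remainder between two parties.
function feeThenSplit(total: bigint): { fee: bigint; a: bigint; b: bigint } {
  const fee = (total * FEE_BPS) / 10_000n;
  const remaining = total - fee;
  const a = remaining / 2n;
  const b = remaining - a; // remainder goes to b: an explicit design choice
  return { fee, a, b };
}

// Caliber B: split first, then fee each share; truncation happens twice.
function splitThenFee(total: bigint): { fee: bigint; a: bigint; b: bigint } {
  const half = total / 2n;
  const feeA = (half * FEE_BPS) / 10_000n;
  const feeB = ((total - half) * FEE_BPS) / 10_000n;
  return { fee: feeA + feeB, a: half - feeA, b: total - half - feeB };
}

// For total = 1001: caliber A collects fee 25, caliber B collects fee 24.
```

Neither caliber is wrong per se; what is wrong is not deciding, documenting, and testing which one the contract uses, and where every truncated remainder ends up.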
Vulnerability Two: Missing Permission Checks
// ❌ Dangerous: Doesn't verify caller
public fun withdraw_all(treasury: &mut Treasury, ctx: &mut TxContext) {
let all = coin::take(&mut treasury.balance, balance::value(&treasury.balance), ctx);
transfer::public_transfer(all, ctx.sender()); // Anyone can withdraw funds!
}
// ✅ Safe: Require OwnerCap
public fun withdraw_all(
treasury: &mut Treasury,
_cap: &TreasuryOwnerCap, // Check caller holds OwnerCap
ctx: &mut TxContext,
) {
let all = coin::take(&mut treasury.balance, balance::value(&treasury.balance), ctx);
transfer::public_transfer(all, ctx.sender());
}
The easiest mistake in permission checks is only verifying “some permission exists,” but not verifying:
- Is this permission for this object
- Is this call allowed in the current scenario
- Should this permission only be used in certain time periods or certain paths
Vulnerability Three: Capability Not Properly Bound
// ❌ Dangerous: OwnerCap doesn't verify corresponding object ID
public fun admin_action(vault: &mut Vault, _cap: &OwnerCap) {
// Any OwnerCap can control any Vault!
}
// ✅ Safe: Verify OwnerCap and object binding relationship
public fun admin_action(vault: &mut Vault, cap: &OwnerCap) {
assert!(cap.authorized_object_id == object::id(vault), ECapMismatch);
// ...
}
Vulnerability Four: Timestamp Manipulation
// ❌ Not recommended: Directly rely on ctx.epoch() as precise time
// epoch granularity is about 24 hours, not suitable for fine granularity timing
// ✅ Recommended: Use Clock object
public fun check_expiry(expiry_ms: u64, clock: &Clock): bool {
clock.timestamp_ms() < expiry_ms
}
Vulnerability Five: Shared Object Race Conditions
Shared objects can be accessed by multiple transactions concurrently. When multiple transactions simultaneously rush to buy the same item:
// ❌ Has race condition problem: Two transactions might both pass check
public fun buy_item(market: &mut Market, ...) {
let listing = table::borrow(&market.listings, item_type_id);
assert!(listing.amount > 0, EOutOfStock);
// ← Another TX might pass the same check here
// ... then both execute purchase, causing overselling
}
// ✅ Sui's solution: Write locks on shared objects ensure serialization
// Sui's Move executor guarantees: transactions writing to the same shared object execute sequentially
// So the above code is actually safe on Sui! But your logic must still handle the out-of-stock (zero-stock) case explicitly
public fun buy_item(market: &mut Market, ...) {
// This check is atomic, other TXs will wait
assert!(table::contains(&market.listings, item_type_id), ENotListed);
let listing = table::remove(&mut market.listings, item_type_id); // Atomic removal
// ...
}
Although Sui serializes writes to shared objects, this doesn’t mean you can ignore business race conditions.
You still need to test:
- Same item purchased in rapid succession
- Object delisted then purchased
- Price updates and purchases happening in adjacent transactions
In other words, the underlying executor solves part of concurrent safety for you, but doesn’t design complete business consistency.
17.4 Using Move Prover for Formal Verification
Move Prover is a formal verification tool that can mathematically prove certain properties always hold:
// spec block: formal specification
// spec block: formal specification for a mint function (illustrative sketch)
spec mint {
    // Declare: total supply increases by exactly the minted amount
    ensures total_supply(treasury) == old(total_supply(treasury)) + amount;
}
#[verify_only]
spec module {
    // Invariant: a vault's balance never exceeds a fixed limit
    invariant forall vault: Vault:
        balance::value(vault.balance) <= MAX_VAULT_SIZE;
}
Running verification:
sui move prove
When Is Move Prover Worth It?
Not all projects need to do formal verification from the start. A more practical strategy is usually:
- Normal cases and small-medium projects: First get unit tests and failure path coverage done well
- High-value vaults, liquidation, permission systems: Then introduce Prover to prove key invariants
Most suitable places for Prover usually include:
- Total supply conservation
- Balance never goes negative
- Certain permissions cannot be exceeded
- A certain state machine won’t jump to illegal states
17.5 Contract Upgrade Strategies
Move packages once published are immutable, but can publish new versions through upgrade mechanism:
# First publish
sui client publish
# Get UpgradeCap object (upgrade capability)
# Upgrade (requires UpgradeCap)
sui client upgrade \
  --upgrade-capability <UPGRADE_CAP_ID>
Upgrade Compatibility Rules
| Change Type | Allowed |
|---|---|
| Add new function | ✅ Allowed |
| Add new module | ✅ Allowed |
| Modify function logic (same signature) | ✅ Allowed |
| Modify function signature | ❌ Not allowed |
| Delete function | ❌ Not allowed |
| Modify struct fields | ❌ Not allowed |
| Add struct fields | ❌ Not allowed |
What’s Really Hard About Upgrades Isn’t the Command, But Data Staying Alive
Many people doing upgrades for the first time focus on “how to publish new package.” But what users really care about is:
- Can old objects still be used
- Can old frontend still read
- How to interpret old events and new objects together
In other words, upgrades are essentially maintaining a still-running system, not restarting the server.
Four Questions You Must Ask Before Upgrading
- Can old objects still be safely read by the new version?
- Does the new version require additional migration scripts?
- Does the frontend need to synchronously update field parsing?
- Once upgraded, if issues are found, is there a rollback or damage control path?
Data Migration Patterns
When needing to change data structures, use “old-new coexistence” strategy:
// v1: Old storage structure
public struct MarketV1 has key {
id: UID,
price: u64,
}
// v2: New version adds fields (cannot directly modify V1's layout)
// Instead, use dynamic fields to extend old objects
use sui::dynamic_field as df;
public fun get_expiry_v2(market: &MarketV1): Option<u64> {
if df::exists_(&market.id, b"expiry") {
option::some(*df::borrow<vector<u8>, u64>(&market.id, b"expiry"))
} else {
option::none()
}
}
// Add the new field to existing old objects (run from a migration script)
public fun migrate_add_expiry(
    market: &mut MarketV1,
    expiry_ms: u64,
) {
    df::add(&mut market.id, b"expiry", expiry_ms);
}
17.6 EVE Frontier-Specific Security Constraints
Quoting key constraints from official documentation:
| Constraint | Details |
|---|---|
| Object Size | Move objects max 250KB |
| Dynamic Fields | Single transaction can access max 1024 |
| Struct Fields | Single struct max 32 fields |
| Transaction Compute Limit | Exceeding compute limit will directly abort |
| Certain Admin Operations | Limited to game server signatures |
Don’t just treat these constraints as “documentation knowledge points.” They will directly affect your modeling approach.
For example:
- Objects have size limits, you can’t stuff all state into one giant object
- Dynamic fields have access limits, you can’t assume one transaction can scan the entire market
- Some operations depend on server signatures, you can’t design the system as purely user-driven
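A habit that follows directly from these limits: batch every large read instead of assuming one call or one transaction can cover everything. A generic chunking helper as a sketch, where the batch size and the `fetchBatch` callback are placeholders (e.g. standing in for a paginated object fetch):

```typescript
// Split a large id list into bounded batches before fetching.
function chunk<T>(items: T[], size: number): T[][] {
  if (size <= 0) throw new Error("chunk size must be positive");
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Fetch in batches; `fetchBatch` is a placeholder for any bounded read API.
async function fetchAll<T, R>(
  ids: T[],
  batchSize: number,
  fetchBatch: (batch: T[]) => Promise<R[]>,
): Promise<R[]> {
  const results: R[] = [];
  for (const batch of chunk(ids, batchSize)) {
    results.push(...(await fetchBatch(batch)));
  }
  return results;
}
```

The same mindset applies on-chain: model state so each transaction touches a bounded slice, rather than scanning an unbounded collection.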
17.7 Security Checklist
Before publishing contracts, check each item:
Permission Control
✅ Do all write functions have permission verification?
✅ Does OwnerCap verify authorized_object_id?
✅ Do AdminACL-protected functions have sponsor verification?
Math Operations
✅ Are all multiplications checked against overflow? (u64 max ≈ 1.8 × 10^19)
✅ Do percentage calculations use bps (basis points) to avoid precision loss?
✅ Is a >= b checked before subtraction?
State Consistency
✅ Are deposit and withdrawal logic completely symmetric?
✅ Are hot potato objects always consumed?
✅ Are atomic operations on shared objects correct?
Upgrade Compatibility
✅ Is UpgradeCap storage planned securely?
✅ Is future data migration path designed?
Test Coverage
✅ Are normal paths tested?
✅ Are all assert failure paths tested?
✅ Are boundary values tested (0, max value)?
More Practical Checking Order
Each time before publishing, recommend going through in this order:
- Permissions: who can call, what they can call, and what changes after the call
- Money: where funds come from, where they go, and whether they can be lost along the way
- State: after both success and failure, do objects remain consistent
- Upgrade: if this version needs changes later, will it lock itself in
This is more useful than purely checking off checklist items, because it forces you to re-examine design according to real risk surfaces.
🔖 Chapter Summary
| Knowledge Point | Core Takeaway |
|---|---|
| Move Testing Framework | test_scenario, #[test], #[expected_failure] |
| Overflow Safety | Move checks by default, but must correctly handle logic errors |
| Permission Checks | All write operations must verify Capability + object_id binding |
| Race Conditions | Sui writes to shared objects execute sequentially, atomic operations are safe |
| Contract Upgrades | UpgradeCap + compatibility rules + dynamic field migration |
| EVE Frontier Constraints | 250KB objects, 1024 dynamic fields/tx, 32 struct fields |
📚 Further Reading
Example 3: On-Chain Auction House (Smart Storage Unit + Dutch Auction)
Goal: Transform a Smart Storage Unit into a Dutch auction (price decreases over time), items automatically transfer to bidders, fully implementing auction contract + bidder dApp + Owner management panel.
Status: Includes contract, dApp, and Move test files. The main content is nearly a complete example, suitable as a “pricing strategy + frontend countdown” demonstration.
Code Directory
Minimal Call Chain
Owner creates auction -> Price decreases over time -> Buyer pays current price -> Auction settles -> Item transfers
Requirements Analysis
Scenario: You control a Smart Storage Unit containing rare ore. Instead of setting a fixed price, you want to maximize sales revenue through a Dutch auction (price descends from high to low), which also offers more transparent price discovery:
- 🕐 Auction starts at 5000 LUX
- 📉 Drops 500 LUX every 10 minutes
- 🏆 Price stops dropping once it reaches the 500 LUX floor
- ⚡ Anyone can buy at the current price at any time, item immediately transfers
- 📊 dApp displays real-time countdown and current price
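The schedule above can be sanity-checked off-chain before writing any Move. A quick TypeScript sketch using the numbers from the requirements (5000 start, 500 floor, minus 500 per 10-minute interval):

```typescript
// Dutch-auction price at a given elapsed time, clamped to the floor.
function priceAt(
  elapsedMs: number,
  startPrice = 5000,
  endPrice = 500,
  dropIntervalMs = 600_000, // 10 minutes
  dropAmount = 500,
): number {
  const drops = Math.floor(elapsedMs / dropIntervalMs)
  const totalDrop = drops * dropAmount
  // Clamp: once cumulative drops cover the start-to-floor gap, stay at the floor.
  return totalDrop >= startPrice - endPrice ? endPrice : startPrice - totalDrop
}

priceAt(0)          // 5000 (start)
priceAt(600_000)    // 4500 (after one interval)
priceAt(5_400_000)  // 500  (90 min: 9 drops cover the full 4500 gap, clamped)
```

This is the same formula the contract's `current_price` implements on-chain; keeping both in sync is what makes the frontend countdown trustworthy.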
Part 1: Move Contract
Directory Structure
dutch-auction/
├── Move.toml
└── sources/
├── dutch_auction.move # Dutch auction logic
└── auction_manager.move # Auction management (create/end)
Core Contract: dutch_auction.move
module dutch_auction::auction;
use world::storage_unit::{Self, StorageUnit};
use world::character::Character;
use world::inventory::Item;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::clock::Clock;
use sui::object::{Self, UID, ID};
use sui::event;
use sui::transfer;
/// SSU Extension Witness
public struct AuctionAuth has drop {}
/// Auction State
public struct DutchAuction has key {
id: UID,
storage_unit_id: ID, // Bound storage box
item_type_id: u64, // Item type being auctioned
start_price: u64, // Starting price (MIST)
end_price: u64, // Minimum price
start_time_ms: u64, // Auction start time
price_drop_interval_ms: u64, // Price drop interval (milliseconds)
price_drop_amount: u64, // Price drop amount per interval
is_active: bool, // Whether still ongoing
proceeds: Balance<SUI>, // Auction proceeds
owner: address, // Auction creator
}
/// Events
public struct AuctionCreated has copy, drop {
auction_id: ID,
item_type_id: u64,
start_price: u64,
end_price: u64,
}
public struct AuctionSettled has copy, drop {
auction_id: ID,
winner: address,
final_price: u64,
item_type_id: u64,
}
// ── Calculate Current Price ─────────────────────────────────────────
public fun current_price(auction: &DutchAuction, clock: &Clock): u64 {
if (!auction.is_active) {
return auction.end_price
}
let elapsed_ms = clock.timestamp_ms() - auction.start_time_ms;
let drops = elapsed_ms / auction.price_drop_interval_ms;
let total_drop = drops * auction.price_drop_amount;
if (total_drop >= auction.start_price - auction.end_price) {
auction.end_price // Already at minimum price
} else {
auction.start_price - total_drop
}
}
/// Calculate time remaining until next price drop (milliseconds)
public fun ms_until_next_drop(auction: &DutchAuction, clock: &Clock): u64 {
let elapsed = clock.timestamp_ms() - auction.start_time_ms;
let interval = auction.price_drop_interval_ms;
let next_drop_at = (elapsed / interval + 1) * interval;
next_drop_at - elapsed
}
// ── Create Auction ─────────────────────────────────────────────
public fun create_auction(
storage_unit: &StorageUnit,
item_type_id: u64,
start_price: u64,
end_price: u64,
price_drop_interval_ms: u64,
price_drop_amount: u64,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(start_price > end_price, EInvalidPricing);
assert!(price_drop_amount > 0, EInvalidDropAmount);
assert!(price_drop_interval_ms >= 60_000, EIntervalTooShort); // Minimum 1 minute
let auction = DutchAuction {
id: object::new(ctx),
storage_unit_id: object::id(storage_unit),
item_type_id,
start_price,
end_price,
start_time_ms: clock.timestamp_ms(),
price_drop_interval_ms,
price_drop_amount,
is_active: true,
proceeds: balance::zero(),
owner: ctx.sender(),
};
event::emit(AuctionCreated {
auction_id: object::id(&auction),
item_type_id,
start_price,
end_price,
});
transfer::share_object(auction);
}
// ── Bid: Pay Current Price to Get Item ──────────────────────────
public fun buy_now(
auction: &mut DutchAuction,
storage_unit: &mut StorageUnit,
character: &Character,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
): Item {
assert!(auction.is_active, EAuctionEnded);
let price = current_price(auction, clock);
assert!(coin::value(&payment) >= price, EInsufficientPayment);
// Return overpayment
let change_amount = coin::value(&payment) - price;
if change_amount > 0 {
let change = payment.split(change_amount, ctx);
transfer::public_transfer(change, ctx.sender());
}
// Revenue goes to auction treasury
balance::join(&mut auction.proceeds, coin::into_balance(payment));
auction.is_active = false;
event::emit(AuctionSettled {
auction_id: object::id(auction),
winner: ctx.sender(),
final_price: price,
item_type_id: auction.item_type_id,
});
// Withdraw item from SSU
storage_unit::withdraw_item(
storage_unit,
character,
AuctionAuth {},
auction.item_type_id,
ctx,
)
}
// ── Owner: Withdraw Auction Proceeds ──────────────────────────────────
public fun withdraw_proceeds(
auction: &mut DutchAuction,
ctx: &mut TxContext,
) {
assert!(ctx.sender() == auction.owner, ENotOwner);
assert!(!auction.is_active, EAuctionStillActive);
let amount = balance::value(&auction.proceeds);
let coin = coin::take(&mut auction.proceeds, amount, ctx);
transfer::public_transfer(coin, ctx.sender());
}
// ── Owner: Cancel Auction ──────────────────────────────────────
public fun cancel_auction(
auction: &mut DutchAuction,
storage_unit: &mut StorageUnit,
character: &Character,
ctx: &mut TxContext,
): Item {
assert!(ctx.sender() == auction.owner, ENotOwner);
assert!(auction.is_active, EAuctionAlreadyEnded);
auction.is_active = false;
// Return item to Owner
storage_unit::withdraw_item(
storage_unit, character, AuctionAuth {}, auction.item_type_id, ctx,
)
}
// Error codes
const EInvalidPricing: u64 = 0;
const EInvalidDropAmount: u64 = 1;
const EIntervalTooShort: u64 = 2;
const EAuctionEnded: u64 = 3;
const EInsufficientPayment: u64 = 4;
const EAuctionStillActive: u64 = 5;
const EAuctionAlreadyEnded: u64 = 6;
const ENotOwner: u64 = 7;
Part 2: Unit Tests
#[test_only]
module dutch_auction::auction_tests;
use dutch_auction::auction;
use sui::test_scenario;
use sui::clock;
use sui::coin;
use sui::sui::SUI;
#[test]
fun test_price_decreases_over_time() {
let mut scenario = test_scenario::begin(@0xA); // test owner address
let mut clock = clock::create_for_testing(scenario.ctx());
// Set to time 0
clock.set_for_testing(0);
// Create mock auction object to test price calculation
let auction = auction::create_test_auction(
5000, // start_price
500, // end_price
600_000, // 10 minutes (ms)
500, // Drop 500 each time
&clock,
scenario.ctx(),
);
// Time 0: Price should be 5000
assert!(auction::current_price(&auction, &clock) == 5000, 0);
// After 10 minutes: Price should be 4500
clock.set_for_testing(600_000);
assert!(auction::current_price(&auction, &clock) == 4500, 0);
// After 90 minutes (9 drops × 500 = 4500, but minimum 500): Price should be 500
clock.set_for_testing(5_400_000);
assert!(auction::current_price(&auction, &clock) == 500, 0);
clock.destroy_for_testing();
auction.destroy_test_auction();
scenario.end();
}
#[test]
#[expected_failure(abort_code = auction::EInsufficientPayment)]
fun test_underpayment_fails() {
// ...Test failure path when payment is insufficient
}
Part 3: Bidder dApp
// src/AuctionApp.tsx
import { useState, useEffect, useCallback } from 'react'
import { useConnection, getObjectWithJson } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
const DUTCH_PACKAGE = "0x_DUTCH_PACKAGE_"
const AUCTION_ID = "0x_AUCTION_ID_"
const STORAGE_UNIT_ID = "0x..."
const CHARACTER_ID = "0x..."
const CLOCK_OBJECT_ID = "0x6"
interface AuctionState {
start_price: string
end_price: string
start_time_ms: string
price_drop_interval_ms: string
price_drop_amount: string
is_active: boolean
item_type_id: string
}
function calculateCurrentPrice(state: AuctionState): number {
if (!state.is_active) return Number(state.end_price)
const now = Date.now()
const elapsed = now - Number(state.start_time_ms)
const drops = Math.floor(elapsed / Number(state.price_drop_interval_ms))
const totalDrop = drops * Number(state.price_drop_amount)
const maxDrop = Number(state.start_price) - Number(state.end_price)
if (totalDrop >= maxDrop) return Number(state.end_price)
return Number(state.start_price) - totalDrop
}
function msUntilNextDrop(state: AuctionState): number {
const now = Date.now()
const elapsed = now - Number(state.start_time_ms)
const interval = Number(state.price_drop_interval_ms)
return interval - (elapsed % interval)
}
export function AuctionApp() {
const { isConnected, handleConnect } = useConnection()
const dAppKit = useDAppKit()
const [auctionState, setAuctionState] = useState<AuctionState | null>(null)
const [currentPrice, setCurrentPrice] = useState(0)
const [countdown, setCountdown] = useState(0)
const [status, setStatus] = useState('')
const [isBuying, setIsBuying] = useState(false)
// Load auction state
const loadAuction = useCallback(async () => {
const obj = await getObjectWithJson(AUCTION_ID)
if (obj?.content?.dataType === 'moveObject') {
const fields = obj.content.fields as AuctionState
setAuctionState(fields)
}
}, [])
useEffect(() => {
loadAuction()
}, [loadAuction])
// Update price countdown every second
useEffect(() => {
if (!auctionState) return
const timer = setInterval(() => {
setCurrentPrice(calculateCurrentPrice(auctionState))
setCountdown(msUntilNextDrop(auctionState))
}, 1000)
return () => clearInterval(timer)
}, [auctionState])
const handleBuyNow = async () => {
if (!isConnected) { setStatus('Please connect wallet first'); return }
setIsBuying(true)
setStatus('⏳ Submitting transaction...')
try {
const tx = new Transaction()
const [paymentCoin] = tx.splitCoins(tx.gas, [
tx.pure.u64(currentPrice + 1_000) // Slightly more than current price, prevent last-second price changes
])
// buy_now returns the purchased Item by value; a PTB must consume it,
// so capture the result and transfer it to the buyer
const [item] = tx.moveCall({
target: `${DUTCH_PACKAGE}::auction::buy_now`,
arguments: [
tx.object(AUCTION_ID),
tx.object(STORAGE_UNIT_ID),
tx.object(CHARACTER_ID),
paymentCoin,
tx.object(CLOCK_OBJECT_ID),
],
})
tx.transferObjects([item], tx.pure.address('0x_BUYER_ADDRESS_')) // placeholder: the connected wallet address
const result = await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`🏆 Bid successful! Tx: ${result.digest.slice(0, 12)}...`)
await loadAuction()
} catch (e: any) {
setStatus(`❌ ${e.message}`)
} finally {
setIsBuying(false)
}
}
const countdownSec = Math.ceil(countdown / 1000)
const priceInSui = (currentPrice / 1e9).toFixed(2)
const nextPriceSui = (
Math.max(Number(auctionState?.end_price ?? 0), currentPrice - Number(auctionState?.price_drop_amount ?? 0)) / 1e9
).toFixed(2)
return (
<div className="auction-app">
<header>
<h1>🔨 Dutch Auction House</h1>
{!isConnected
? <button onClick={handleConnect}>Connect Wallet</button>
: <span className="connected">✅ Connected</span>
}
</header>
{auctionState ? (
<div className="auction-board">
<div className="current-price">
<span className="label">Current Price</span>
<span className="price">{priceInSui} SUI</span>
</div>
<div className="countdown">
<span className="label">⏳ Drops to {nextPriceSui} SUI in {countdownSec}s</span>
<span className="next-price">{nextPriceSui} SUI</span>
</div>
<div className="info-row">
<span>Starting Price: {(Number(auctionState.start_price) / 1e9).toFixed(2)} SUI</span>
<span>Minimum Price: {(Number(auctionState.end_price) / 1e9).toFixed(2)} SUI</span>
</div>
{auctionState.is_active ? (
<button
className="buy-btn"
onClick={handleBuyNow}
disabled={isBuying || !isConnected}
>
{isBuying ? '⏳ Buying...' : `💰 Buy Now ${priceInSui} SUI`}
</button>
) : (
<div className="sold-banner">🎉 Sold</div>
)}
{status && <p className="tx-status">{status}</p>}
</div>
) : (
<div>Loading auction info...</div>
)}
</div>
)
}
🎯 Complete Review
Contract Layer
├── create_auction() → Creates shared DutchAuction object
├── current_price() → Calculates current price based on time (pure calculation, no state modification)
├── buy_now() → Payment → Revenue to treasury → Withdraw item from SSU → Emit event
├── cancel_auction() → Owner cancels, returns item
└── withdraw_proceeds() → Owner withdraws auction proceeds
dApp Layer
├── Recalculates price every second (pure frontend, no Gas consumption)
├── Countdown display for next price drop
└── One-click purchase, automatically attaches current price
🔧 Extension Exercises
- Support batch auctions: Auction multiple item types simultaneously, each with independent countdown
- Scheduled purchase: Players set target price, automatically trigger purchase when reached (off-chain monitoring + scheduled submission)
- Transaction history: Monitor AuctionSettled events to display recent transaction data
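For the transaction-history exercise, most of the work is shaping raw events into display rows. A minimal sketch (field names assumed to match the AuctionSettled event defined in the contract above; the formatting helper is illustrative):

```typescript
// Shape a raw AuctionSettled event into a row for a "recent sales" feed.
interface SettledEvent {
  auction_id: string
  winner: string
  final_price: string  // u64 values arrive from RPC as decimal strings
  item_type_id: string
}

function toSaleRow(e: SettledEvent): string {
  const priceSui = (Number(e.final_price) / 1e9).toFixed(2) // MIST -> SUI
  const who = `${e.winner.slice(0, 6)}...${e.winner.slice(-4)}` // shorten address
  return `Item ${e.item_type_id} sold for ${priceSui} SUI to ${who}`
}
```

Feed this from the same event subscription used elsewhere in the course (e.g. a `useRealtimeEvents`-style hook filtered on the `AuctionSettled` event type).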
📚 Related Documentation
Example 6: Dynamic NFT Equipment System (Evolving Spaceship Weapons)
Goal: Create a spaceship weapon NFT system where attributes automatically upgrade based on combat results; utilize Sui Display standard to ensure NFTs display latest status in real-time across all wallets and marketplaces.
Status: Teaching example. Main content focuses on dynamic NFT and Display updates; the complete directory is at book/src/code/example-06/.
Code Directory
Minimal Call Chain
Player holds weapon NFT -> Kill events accumulate -> Reaches threshold to upgrade -> Display metadata updates -> Wallet/marketplace displays new appearance
Requirements Analysis
Scenario: You are designing a “growth weapon” system: players obtain a PlasmaRifle that starts as an ordinary weapon and automatically upgrades its appearance and attributes as kills accumulate:
- ⚪ Basic (0-9 kills): Plasma Rifle Mk.1, base damage
- 🔵 Elite (10-49 kills): Plasma Rifle Mk.2, image changes to elite version, damage +30%
- 🟡 Legendary (50+ kills): Plasma Rifle Mk.3 “Inferno”, image changes to legendary version, special effects
Part 1: NFT Contract
module dynamic_nft::plasma_rifle;
use sui::object::{Self, UID, ID};
use sui::display;
use sui::package;
use sui::transfer;
use sui::event;
use std::string::{Self, String, utf8};
// ── One-Time Witness ─────────────────────────────────────────────
public struct PLASMA_RIFLE has drop {}
// ── Weapon Tier Constants ───────────────────────────────────────────
const TIER_BASIC: u8 = 1;
const TIER_ELITE: u8 = 2;
const TIER_LEGENDARY: u8 = 3;
const KILLS_FOR_ELITE: u64 = 10;
const KILLS_FOR_LEGENDARY: u64 = 50;
// ── Data Structures ───────────────────────────────────────────────
public struct PlasmaRifle has key, store {
id: UID,
name: String,
tier: u8,
kills: u64,
damage_bonus_pct: u64, // Damage bonus (percentage)
image_url: String,
description: String,
owner_history: u64, // Historical transfer count
}
public struct ForgeAdminCap has key, store {
id: UID,
}
// ── Events ──────────────────────────────────────────────────
public struct RifleEvolved has copy, drop {
rifle_id: ID,
from_tier: u8,
to_tier: u8,
total_kills: u64,
}
// ── Initialization ────────────────────────────────────────────────
fun init(witness: PLASMA_RIFLE, ctx: &mut TxContext) {
let publisher = package::claim(witness, ctx);
let keys = vector[
utf8(b"name"),
utf8(b"description"),
utf8(b"image_url"),
utf8(b"attributes"),
utf8(b"project_url"),
];
let values = vector[
utf8(b"{name}"),
utf8(b"{description}"),
utf8(b"{image_url}"),
// attributes concatenates multiple fields
utf8(b"[{\"trait_type\":\"Tier\",\"value\":\"{tier}\"},{\"trait_type\":\"Kills\",\"value\":\"{kills}\"},{\"trait_type\":\"Damage Bonus\",\"value\":\"{damage_bonus_pct}%\"}]"),
utf8(b"https://evefrontier.com/weapons"),
];
let mut display = display::new_with_fields<PlasmaRifle>(
&publisher, keys, values, ctx,
);
display::update_version(&mut display);
let admin_cap = ForgeAdminCap { id: object::new(ctx) };
transfer::public_transfer(publisher, ctx.sender());
transfer::public_freeze_object(display);
transfer::public_transfer(admin_cap, ctx.sender());
}
// ── Forge Initial Weapon ──────────────────────────────────────────
public fun forge_rifle(
_admin: &ForgeAdminCap,
recipient: address,
ctx: &mut TxContext,
) {
let rifle = PlasmaRifle {
id: object::new(ctx),
name: utf8(b"Plasma Rifle Mk.1"),
tier: TIER_BASIC,
kills: 0,
damage_bonus_pct: 0,
image_url: utf8(b"https://assets.example.com/weapons/plasma_mk1.png"),
description: utf8(b"A standard-issue plasma rifle. Prove yourself in combat."),
owner_history: 0,
};
transfer::public_transfer(rifle, recipient);
}
// ── Record Kill (called by turret extension)────────────────────────
public fun record_kill(
rifle: &mut PlasmaRifle,
ctx: &TxContext,
) {
rifle.kills = rifle.kills + 1;
check_and_evolve(rifle);
}
fun check_and_evolve(rifle: &mut PlasmaRifle) {
let old_tier = rifle.tier;
if (rifle.kills >= KILLS_FOR_LEGENDARY && rifle.tier < TIER_LEGENDARY) {
rifle.tier = TIER_LEGENDARY;
rifle.name = utf8(b"Plasma Rifle Mk.3 \"Inferno\"");
rifle.damage_bonus_pct = 60;
rifle.image_url = utf8(b"https://assets.example.com/weapons/plasma_legendary.png");
rifle.description = utf8(b"This weapon has bathed in the fires of a thousand battles. Its plasma burns with legendary fury.");
} else if (rifle.kills >= KILLS_FOR_ELITE && rifle.tier < TIER_ELITE) {
rifle.tier = TIER_ELITE;
rifle.name = utf8(b"Plasma Rifle Mk.2");
rifle.damage_bonus_pct = 30;
rifle.image_url = utf8(b"https://assets.example.com/weapons/plasma_mk2.png");
rifle.description = utf8(b"Battle-hardened and upgraded. The plasma cells burn hotter than standard.");
};
if (old_tier != rifle.tier) {
event::emit(RifleEvolved {
rifle_id: object::id(rifle),
from_tier: old_tier,
to_tier: rifle.tier,
total_kills: rifle.kills,
});
}
}
// ── Getter Functions ──────────────────────────────────────────────
public fun get_tier(rifle: &PlasmaRifle): u8 { rifle.tier }
public fun get_kills(rifle: &PlasmaRifle): u64 { rifle.kills }
public fun get_damage_bonus(rifle: &PlasmaRifle): u64 { rifle.damage_bonus_pct }
// ── Transfer Tracking (optional) ─────────────────────────────────────
// If using TransferPolicy, can track transfer count
// Simplified here via event monitoring
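The tier thresholds can also be mirrored off-chain so the dApp never hardcodes magic numbers in JSX. A sketch (thresholds copied from the module constants above; helper names are illustrative):

```typescript
// Mirror of the on-chain thresholds: KILLS_FOR_ELITE = 10, KILLS_FOR_LEGENDARY = 50.
const KILLS_FOR_ELITE = 10
const KILLS_FOR_LEGENDARY = 50

type Tier = 1 | 2 | 3 // Basic | Elite | Legendary

function tierForKills(kills: number): Tier {
  if (kills >= KILLS_FOR_LEGENDARY) return 3
  if (kills >= KILLS_FOR_ELITE) return 2
  return 1
}

// Kills still needed before the next evolution (null once at max tier).
function killsToNextTier(kills: number): number | null {
  const tier = tierForKills(kills)
  if (tier === 1) return KILLS_FOR_ELITE - kills
  if (tier === 2) return KILLS_FOR_LEGENDARY - kills
  return null
}
```

The WeaponDisplay component in Part 3 computes the same progress values inline; factoring them into helpers like these keeps the frontend consistent with the contract if thresholds change.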
Part 2: Turret Extension - Combat Result Reports to Weapon
module dynamic_nft::turret_combat;
use dynamic_nft::plasma_rifle::{Self, PlasmaRifle};
use world::turret::{Self, Turret};
use world::character::Character;
public struct CombatAuth has drop {}
/// Turret kill event (called by turret extension)
public fun on_kill(
turret: &Turret,
killer: &Character,
weapon: &mut PlasmaRifle, // Player's weapon
ctx: &TxContext,
) {
// Verify legitimate turret extension call (requires CombatAuth)
turret::verify_extension(turret, CombatAuth {});
// Record kill to weapon
plasma_rifle::record_kill(weapon, ctx);
}
Part 3: Frontend Weapon Display dApp
// src/WeaponDisplay.tsx
import { useState, useEffect } from 'react'
import { useCurrentClient } from '@mysten/dapp-kit-react'
import { useRealtimeEvents } from './hooks/useRealtimeEvents'
const DYNAMIC_NFT_PKG = "0x_DYNAMIC_NFT_PACKAGE_"
interface RifleData {
name: string
tier: string
kills: string
damage_bonus_pct: string
image_url: string
description: string
}
const TIER_COLORS = {
'1': '#9CA3AF', // Gray (basic)
'2': '#3B82F6', // Blue (elite)
'3': '#F59E0B', // Gold (legendary)
}
const TIER_LABELS = { '1': 'Basic', '2': 'Elite', '3': 'Legendary' }
export function WeaponDisplay({ rifleId }: { rifleId: string }) {
const client = useCurrentClient()
const [rifle, setRifle] = useState<RifleData | null>(null)
const [justEvolved, setJustEvolved] = useState(false)
const loadRifle = async () => {
const obj = await client.getObject({
id: rifleId,
options: { showContent: true },
})
if (obj.data?.content?.dataType === 'moveObject') {
setRifle(obj.data.content.fields as RifleData)
}
}
useEffect(() => { loadRifle() }, [rifleId])
// Monitor evolution events
const evolutions = useRealtimeEvents<{
rifle_id: string; from_tier: string; to_tier: string; total_kills: string
}>(`${DYNAMIC_NFT_PKG}::plasma_rifle::RifleEvolved`)
useEffect(() => {
const myEvolution = evolutions.find(e => e.rifle_id === rifleId)
if (myEvolution) {
setJustEvolved(true)
loadRifle() // Reload latest data
setTimeout(() => setJustEvolved(false), 5000)
}
}, [evolutions])
if (!rifle) return <div className="loading">Loading weapon data...</div>
const tierColor = TIER_COLORS[rifle.tier as keyof typeof TIER_COLORS]
const tierLabel = TIER_LABELS[rifle.tier as keyof typeof TIER_LABELS]
const killsForNextTier = rifle.tier === '1'
? 10 : rifle.tier === '2' ? 50 : null
const progress = killsForNextTier
? Math.min(100, (Number(rifle.kills) / killsForNextTier) * 100) : 100
return (
<div className="weapon-card" style={{ borderColor: tierColor }}>
{justEvolved && (
<div className="evolution-banner">
✨ Weapon Evolved!
</div>
)}
<div className="weapon-image-container">
<img
src={rifle.image_url}
alt={rifle.name}
className={`weapon-image tier-${rifle.tier}`}
/>
<span className="tier-badge" style={{ background: tierColor }}>
{tierLabel}
</span>
</div>
<div className="weapon-info">
<h2>{rifle.name}</h2>
<p className="description">{rifle.description}</p>
<div className="stats">
<div className="stat">
<span>⚔️ Kills</span>
<strong>{rifle.kills}</strong>
</div>
<div className="stat">
<span>💥 Damage Bonus</span>
<strong>+{rifle.damage_bonus_pct}%</strong>
</div>
</div>
{killsForNextTier && (
<div className="evolution-progress">
<span>Evolution Progress: {rifle.kills} / {killsForNextTier} kills</span>
<div className="progress-bar">
<div
className="progress-fill"
style={{ width: `${progress}%`, background: tierColor }}
/>
</div>
</div>
)}
{!killsForNextTier && (
<div className="max-tier-badge">👑 Max Tier Reached</div>
)}
</div>
</div>
)
}
🎯 Complete Review
Contract Layer
├── plasma_rifle.move
│ ├── PlasmaRifle (NFT object, fields update with combat)
│ ├── Display (template references fields → wallet auto-syncs display)
│ ├── forge_rifle() ← Owner mints and distributes
│ ├── record_kill() ← Turret contract calls
│ └── check_and_evolve() ← Internal: check threshold, upgrade fields + emit event
│
└── turret_combat.move
└── on_kill() ← Turret kill triggers weapon upgrade
dApp Layer
└── WeaponDisplay.tsx
├── Subscribe to RifleEvolved event (refresh immediately on evolution)
├── Dynamic color theme (by tier)
└── Evolution progress bar
🔧 Extension Exercises
- Weapon Durability: Each use decreases a durability field; when it runs low, quality degrades and damage drops until the weapon is repaired
- Special Attributes: Legendary tier randomly gains special affixes (using random numbers + dynamic fields)
- Weapon Fusion: Two Elite weapons destroyed → mint one Legendary (material consumption upgrade)
📚 Related Documentation
Example 7: Gate Logistics Network (Multi-Hop Routing System)
Goal: Build a logistics network where an alliance owns multiple gates, supporting “A → B → C” multi-hop routing, off-chain calculates optimal path, on-chain atomically executes multiple jumps; with route planning dApp.
Status: Teaching example. Current example focuses on multi-hop routing and off-chain planning architecture, emphasizing unified interfaces rather than individual Move contracts.
Code Directory
Minimal Call Chain
Off-chain calculates optimal route -> Builds multi-hop PTB -> On-chain atomically executes all jumps -> All succeed or all rollback
Requirements Analysis
Scenario: Your alliance controls 5 interconnected gates, forming the following topology:
Mining Area ──[Gate1]──► Hub Alpha ──[Gate2]──► Trade Hub
                            │
                         [Gate3]
                            │
Refinery ──[Gate4]──► Manufacturing
    │
 [Gate5]
    │
Safe Harbor
Requirements:
- Players can purchase “multi-hop passes” in one transaction, completing composite routes like A→Hub Alpha→Trade Hub
- Route calculation done off-chain (saves Gas)
- On-chain atomic execution: all jumps succeed or all rollback
- dApp provides visual route planner
Part 1: Multi-Hop Routing Contract
module logistics::multi_hop;
use world::gate::{Self, Gate};
use world::character::Character;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::clock::Clock;
use sui::object::{Self, ID};
use sui::event;
public struct LogisticsAuth has drop {}
/// Purchase multi-hop route in one transaction
public fun purchase_route(
source_gate: &Gate,
hop1_dest: &Gate, // First hop destination
hop2_source: &Gate, // Second hop source (= hop1_dest's linked gate)
hop2_dest: &Gate, // Second hop destination
character: &Character,
mut payment: Coin<SUI>, // Payment for both hops' total toll
clock: &Clock,
ctx: &mut TxContext,
) {
// Verify route continuity: hop1_dest and hop2_source must be linked gates
assert!(
gate::are_linked(hop1_dest, hop2_source),
ERouteDiscontinuous,
);
// Calculate and deduct toll for each hop
let hop1_toll = get_toll(source_gate);
let hop2_toll = get_toll(hop2_source);
let total_toll = hop1_toll + hop2_toll;
assert!(coin::value(&payment) >= total_toll, EInsufficientPayment);
// Return change
let change = payment.split(coin::value(&payment) - total_toll, ctx);
if (coin::value(&change) > 0) {
transfer::public_transfer(change, ctx.sender());
} else { coin::destroy_zero(change); }
// Issue two JumpPermits (valid for 1 hour)
let expires = clock.timestamp_ms() + 60 * 60 * 1000;
gate::issue_jump_permit(
source_gate, hop1_dest, character, LogisticsAuth {}, expires, ctx,
);
gate::issue_jump_permit(
hop2_source, hop2_dest, character, LogisticsAuth {}, expires, ctx,
);
// Collect tolls
let hop1_coin = payment.split(hop1_toll, ctx);
let hop2_coin = payment;
collect_toll(source_gate, hop1_coin, ctx);
collect_toll(hop2_source, hop2_coin, ctx);
event::emit(RouteTicketIssued {
character_id: object::id(character),
gates: vector[object::id(source_gate), object::id(hop1_dest), object::id(hop2_dest)],
total_toll,
});
}
/// General N-hop routing (accepts variable-length routes)
/// NOTE: Move does not support vectors of references, so `vector<&Gate>` is
/// illustrative only; in practice, compose N single-hop calls in one PTB.
public fun purchase_route_n_hops(
gates: vector<&Gate>, // Gate list [A, B, C, D, ...] (illustrative)
character: &Character,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let n = vector::length(&gates);
assert!(n >= 2, ETooFewGates);
assert!(n <= 6, ETooManyHops); // Prevent overly large transactions
// Verify route continuity (each adjacent pair of gates must be linked)
let mut i = 0;
while (i < n - 1) {
assert!(
gate::are_linked(vector::borrow(&gates, i), vector::borrow(&gates, i + 1)),
ERouteDiscontinuous,
);
i = i + 1;
};
// Calculate total toll
let mut total: u64 = 0;
let mut j = 0;
while (j < n - 1) {
total = total + get_toll(vector::borrow(&gates, j));
j = j + 1;
};
assert!(coin::value(&payment) >= total, EInsufficientPayment);
// Issue all Permits
let expires = clock.timestamp_ms() + 60 * 60 * 1000;
let mut k = 0;
while (k < n - 1) {
gate::issue_jump_permit(
vector::borrow(&gates, k),
vector::borrow(&gates, k + 1),
character,
LogisticsAuth {},
expires,
ctx,
);
k = k + 1;
};
// Refund change
let change = payment.split(coin::value(&payment) - total, ctx);
if (coin::value(&change) > 0) {
transfer::public_transfer(change, ctx.sender());
} else { coin::destroy_zero(change); }
// Process payment to each gate treasury...
}
fun get_toll(gate: &Gate): u64 {
// Read toll from gate's extension data (dynamic field)
// Simplified version: fixed rate
10_000_000_000 // 10 SUI
}
fun collect_toll(gate: &Gate, coin: Coin<SUI>, ctx: &TxContext) {
// Transfer coin to gate's corresponding Treasury
// ...
}
public struct RouteTicketIssued has copy, drop {
character_id: ID,
gates: vector<ID>,
total_toll: u64,
}
const ERouteDiscontinuous: u64 = 0;
const EInsufficientPayment: u64 = 1;
const ETooFewGates: u64 = 2;
const ETooManyHops: u64 = 3;
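Since the N-hop Move signature above is illustrative, the practical pattern is to keep the contract single-hop-aware and let the client expand a planned path into per-hop calls inside one PTB. The pure planning step can be sketched as follows (names are illustrative; tolls are charged per departure gate, as in get_toll above):

```typescript
// Expand a node path [A, B, C, ...] into per-hop (source, dest) pairs plus
// the total toll, ready to drive one moveCall per hop inside a single PTB.
interface Hop { source: string; dest: string; toll: number }

function expandPath(
  path: string[],
  tollOf: (gateId: string) => number,
): { hops: Hop[]; totalToll: number } {
  if (path.length < 2) throw new Error('route needs at least 2 gates')
  const hops: Hop[] = []
  for (let i = 0; i < path.length - 1; i++) {
    // Toll for a hop is owed to its departure gate.
    hops.push({ source: path[i], dest: path[i + 1], toll: tollOf(path[i]) })
  }
  return { hops, totalToll: hops.reduce((sum, h) => sum + h.toll, 0) }
}
```

Because all moveCalls built from these hops land in one transaction, atomicity is preserved: either every hop's permit is issued or the whole route purchase rolls back.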
Part 2: Off-Chain Path Planning (Dijkstra)
// lib/routePlanner.ts
interface Gate {
id: string
name: string
linkedGates: string[] // Linked gate ID list
tollAmount: number // Toll (SUI)
}
interface Route {
gateIds: string[]
totalToll: number
hops: number
}
// Dijkstra shortest path (weighted by toll)
export function findCheapestRoute(
gateMap: Map<string, Gate>,
fromId: string,
toId: string,
): Route | null {
const dist = new Map<string, number>()
const prev = new Map<string, string | null>()
const unvisited = new Set(gateMap.keys())
for (const id of gateMap.keys()) {
dist.set(id, Infinity)
prev.set(id, null)
}
dist.set(fromId, 0)
while (unvisited.size > 0) {
// Find unvisited node with minimum distance
let current: string | null = null
let minDist = Infinity
for (const id of unvisited) {
if ((dist.get(id) ?? Infinity) < minDist) {
minDist = dist.get(id)!
current = id
}
}
if (!current || current === toId) break
unvisited.delete(current)
const gate = gateMap.get(current)!
for (const neighborId of gate.linkedGates) {
const neighbor = gateMap.get(neighborId)
if (!neighbor || !unvisited.has(neighborId)) continue
const newDist = (dist.get(current) ?? 0) + neighbor.tollAmount
if (newDist < (dist.get(neighborId) ?? Infinity)) {
dist.set(neighborId, newDist)
prev.set(neighborId, current)
}
}
}
if (dist.get(toId) === Infinity) return null // Unreachable
// Reconstruct path
const path: string[] = []
let cur: string | null = toId
while (cur) {
path.unshift(cur)
cur = prev.get(cur) ?? null
}
return {
gateIds: path,
totalToll: dist.get(toId) ?? 0,
hops: path.length - 1,
}
}
Part 3: Route Planner dApp
// src/RoutePlannerApp.tsx
import { useState, useEffect } from 'react'
import { useConnection } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { findCheapestRoute } from '../lib/routePlanner'
import { Transaction } from '@mysten/sui/transactions'
const LOGISTICS_PKG = "0x_LOGISTICS_PACKAGE_"
// Gate network topology (usually read from on-chain)
const GATE_NETWORK = new Map([
['gate_mining', { id: 'gate_mining', name: 'Mining Entry', linkedGates: ['gate_hub_alpha'], tollAmount: 5 }],
['gate_hub_alpha', { id: 'gate_hub_alpha', name: 'Hub Alpha', linkedGates: ['gate_mining', 'gate_trade', 'gate_refinery'], tollAmount: 3 }],
['gate_trade', { id: 'gate_trade', name: 'Trade Hub', linkedGates: ['gate_hub_alpha'], tollAmount: 8 }],
['gate_refinery', { id: 'gate_refinery', name: 'Refinery', linkedGates: ['gate_hub_alpha', 'gate_manufacturing', 'gate_harbor'], tollAmount: 4 }],
['gate_manufacturing', { id: 'gate_manufacturing', name: 'Manufacturing', linkedGates: ['gate_refinery'], tollAmount: 6 }],
['gate_harbor', { id: 'gate_harbor', name: 'Safe Harbor', linkedGates: ['gate_refinery'], tollAmount: 2 }],
])
export function RoutePlannerApp() {
const { isConnected, handleConnect } = useConnection()
const dAppKit = useDAppKit()
const [from, setFrom] = useState('')
const [to, setTo] = useState('')
const [route, setRoute] = useState<{gateIds: string[]; totalToll: number; hops: number} | null>(null)
const [status, setStatus] = useState('')
const planRoute = () => {
if (!from || !to) return
const result = findCheapestRoute(GATE_NETWORK, from, to)
setRoute(result)
}
const purchaseRoute = async () => {
if (!route || route.gateIds.length < 2) return
const tx = new Transaction()
// Prepare payment (total toll + 5% buffer to prevent price changes)
const totalSui = Math.ceil(route.totalToll * 1.05) * 1e9
const [paymentCoin] = tx.splitCoins(tx.gas, [tx.pure.u64(totalSui)])
// Build gate parameter list
const gateArgs = route.gateIds.map(id => tx.object(id))
// Call multi-hop routing contract
if (route.hops === 2) {
tx.moveCall({
target: `${LOGISTICS_PKG}::multi_hop::purchase_route`,
arguments: [
gateArgs[0], gateArgs[1], gateArgs[1], gateArgs[2],
tx.object('CHARACTER_ID'),
paymentCoin,
tx.object('0x6'),
],
})
}
try {
setStatus('⏳ Purchasing route pass...')
const result = await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Route purchased successfully! Tx: ${result.digest.slice(0, 12)}...`)
} catch (e: any) {
setStatus(`❌ ${e.message}`)
}
}
return (
<div className="route-planner">
<h1>🗺 Gate Logistics Route Planner</h1>
<div className="planner-inputs">
<div>
<label>Departure Gate</label>
<select value={from} onChange={e => setFrom(e.target.value)}>
<option value="">Select departure...</option>
{[...GATE_NETWORK.values()].map(g => (
<option key={g.id} value={g.id}>{g.name}</option>
))}
</select>
</div>
<div className="arrow">→</div>
<div>
<label>Destination Gate</label>
<select value={to} onChange={e => setTo(e.target.value)}>
<option value="">Select destination...</option>
{[...GATE_NETWORK.values()].map(g => (
<option key={g.id} value={g.id}>{g.name}</option>
))}
</select>
</div>
<button onClick={planRoute} disabled={!from || !to || from === to}>
📍 Plan Route
</button>
</div>
{route && (
<div className="route-result">
<h3>Optimal Route (Lowest Cost)</h3>
<div className="route-path">
{route.gateIds.map((id, i) => (
// key must live on the outermost element; a bare <> fragment cannot carry one
<span key={id}>
<span className="gate-node">
{GATE_NETWORK.get(id)?.name}
</span>
{i < route.gateIds.length - 1 && (
<span className="arrow-icon">→</span>
)}
</span>
))}
</div>
<div className="route-stats">
<span>🔀 Jumps: {route.hops}</span>
<span>💰 Total Cost: {route.totalToll} SUI</span>
</div>
<button
className="purchase-btn"
onClick={purchaseRoute}
disabled={!isConnected}
>
{isConnected ? '🚀 One-Click Purchase Full Route Pass' : 'Please Connect Wallet'}
</button>
</div>
)}
{route === null && from && to && from !== to && (
<p className="no-route">⚠️ No route found from {GATE_NETWORK.get(from)?.name ?? from} to {GATE_NETWORK.get(to)?.name ?? to}</p>
)}
{status && <p className="status">{status}</p>}
</div>
)
}
🎯 Complete Review
Contract Layer
├── multi_hop.move
│ ├── purchase_route() → Two-hop quick version (specifies 4 gate parameters)
│ ├── purchase_route_n_hops() → N-hop general version (vector parameters, max 6 hops)
│ └── LogisticsAuth {} → Gate extension Witness
Off-Chain Path Planning
└── routePlanner.ts
└── findCheapestRoute() → Dijkstra, weighted by toll
dApp Layer
└── RoutePlannerApp.tsx
├── Dropdown select departure/destination
├── Call Dijkstra to display optimal route
└── One-click purchase full route pass
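For reference, the off-chain planner named in the review can be sketched as a plain Dijkstra over the gate map, weighted by the toll charged at the departure gate. This is a minimal reconstruction; the actual routePlanner.ts in the repo may differ in details such as the exact cost model:

```typescript
// routePlanner.ts (sketch) — field names mirror the GATE_NETWORK entries above
export interface GateInfo {
  id: string
  name: string
  linkedGates: string[]
  tollAmount: number
}

export function findCheapestRoute(
  network: Map<string, GateInfo>,
  from: string,
  to: string,
): { gateIds: string[]; totalToll: number; hops: number } | null {
  const dist = new Map<string, number>([[from, 0]])
  const prev = new Map<string, string>()
  const visited = new Set<string>()
  while (true) {
    // Pick the cheapest unvisited gate (an O(V^2) scan; fine for small networks)
    let current: string | null = null
    for (const [id, d] of dist) {
      if (!visited.has(id) && (current === null || d < dist.get(current)!)) current = id
    }
    if (current === null) return null // destination unreachable
    if (current === to) break
    visited.add(current)
    const gate = network.get(current)
    if (!gate) continue
    for (const next of gate.linkedGates) {
      // Assumed cost model: the toll is paid at the departure gate
      const cost = dist.get(current)! + gate.tollAmount
      if (cost < (dist.get(next) ?? Infinity)) {
        dist.set(next, cost)
        prev.set(next, current)
      }
    }
  }
  // Walk `prev` backwards to reconstruct the gate sequence
  const gateIds: string[] = [to]
  let node = to
  while (node !== from) {
    const p = prev.get(node)
    if (p === undefined) break
    gateIds.unshift(p)
    node = p
  }
  return { gateIds, totalToll: dist.get(to)!, hops: gateIds.length - 1 }
}
```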
🔧 Extension Exercises
- Shortest-Hop Routing: Implement a second mode that minimizes hop count rather than total toll
- Real-Time Congestion Awareness: Subscribe to GateJumped events, compute per-gate traffic over the last 5 minutes, and route around congested gates
- Item Escort Insurance: Let players buy an optional “item loss insurance” NFT alongside the route and pay out compensation if the escort fails
📚 Related Documentation
Example 9: Cross-Builder Protocol Aggregator Market
Goal: Design a “protocol adapter” layer that lets users access multiple Builder-published market contracts (each with a different interface) from a single dApp, for a DEX-aggregator-like experience.
Status: Teaching example. This example focuses on aggregator architecture and adapter layering; the emphasis is the unified interface, not the individual Move contracts.
Code Directory
Minimal Call Chain
Frontend queries multiple markets -> Adapter normalizes quotes -> Select optimal market -> Submit purchase per corresponding protocol
Requirements Analysis
Scenario: The EVE Frontier ecosystem already has 3 different Builder market contracts:
| Builder | Contract Address | Interface Style |
|---|---|---|
| Builder Alice | 0xAAA... | buy_item(market, character, item_id, coin) |
| Builder Bob | 0xBBB... | purchase(storage, char, type_id, payment, ctx) |
| You (Builder You) | 0xYYY... | buy_item_v2(market, character, item_id, coin, clock, ctx) |
Players want to find which market has the cheapest item and buy with one click.
Part 1: Off-Chain Adapter Layer (TypeScript)
Because the contracts expose different interfaces, the adapters live off-chain and wrap each market behind a single unified interface:
// lib/marketAdapters.ts
import { Transaction } from "@mysten/sui/transactions"
import { SuiClient } from "@mysten/sui/client"
export interface MarketListing {
marketId: string
builder: string
itemTypeId: number
price: number // SUI
adapterName: string
}
// ── Adapter Interface ─────────────────────────────────────────────
export interface MarketAdapter {
name: string
packageId: string
// Query item price in this market
getPrice(client: SuiClient, itemTypeId: number): Promise<number | null>
// Build purchase transaction
buildBuyTx(
tx: Transaction,
itemTypeId: number,
characterId: string,
paymentCoin: any
): void
}
// ── Adapter A: Builder Alice's Market ────────────────────────
export const AliceMarketAdapter: MarketAdapter = {
name: "Alice's Market",
packageId: "0xAAA...",
async getPrice(client, itemTypeId) {
// Alice's market uses Table to store listings, key is item_id
const obj = await client.getDynamicFieldObject({
parentId: "0xAAA_MARKET_ID",
name: { type: "u64", value: itemTypeId.toString() },
})
const fields = (obj.data?.content as any)?.fields
return fields ? Number(fields.price) / 1e9 : null
},
buildBuyTx(tx, itemTypeId, characterId, paymentCoin) {
tx.moveCall({
target: `0xAAA...::market::buy_item`,
arguments: [
tx.object("0xAAA_MARKET_ID"),
tx.object(characterId),
tx.pure.u64(itemTypeId),
paymentCoin,
],
})
},
}
// ── Adapter B: Builder Bob's Market ──────────────────────────
export const BobMarketAdapter: MarketAdapter = {
name: "Bob's Depot",
packageId: "0xBBB...",
async getPrice(client, itemTypeId) {
// Bob's market uses a different struct; the price field is named 'cost'
const obj = await client.getObject({
id: "0xBBB_STORAGE_ID",
options: { showContent: true },
})
const listings = (obj.data?.content as any)?.fields?.listings?.fields?.contents
const found = listings?.find((e: any) => Number(e.fields?.key) === itemTypeId)
return found ? Number(found.fields.value.fields.cost) / 1e9 : null
},
buildBuyTx(tx, itemTypeId, characterId, paymentCoin) {
tx.moveCall({
target: `0xBBB...::depot::purchase`,
arguments: [
tx.object("0xBBB_STORAGE_ID"),
tx.object(characterId),
tx.pure.u64(itemTypeId),
paymentCoin,
],
})
},
}
// ── Adapter C: Your Own Market ────────────────────────────────
export const MyMarketAdapter: MarketAdapter = {
name: "Your Market",
packageId: "0xYYY...",
async getPrice(client, itemTypeId) {
// Your own market is fully documented, so this read is the most straightforward
const obj = await client.getDynamicFieldObject({
parentId: "0xYYY_MARKET_ID",
name: { type: "u64", value: itemTypeId.toString() },
})
const fields = (obj.data?.content as any)?.fields
return fields ? Number(fields.value.fields.price) / 1e9 : null
},
buildBuyTx(tx, itemTypeId, characterId, paymentCoin) {
tx.moveCall({
target: `0xYYY...::market::buy_item_v2`,
arguments: [
tx.object("0xYYY_MARKET_ID"),
tx.object(characterId),
tx.pure.u64(itemTypeId),
paymentCoin,
tx.object("0x6"), // Clock (V2 adds this parameter)
],
})
},
}
// ── Aggregate Price Query ──────────────────────────────────────────
const ALL_ADAPTERS = [AliceMarketAdapter, BobMarketAdapter, MyMarketAdapter]
export async function aggregatePrices(
client: SuiClient,
itemTypeId: number,
): Promise<MarketListing[]> {
const results = await Promise.all(
ALL_ADAPTERS.map(async (adapter) => {
const price = await adapter.getPrice(client, itemTypeId).catch(() => null)
if (price === null) return null
return {
marketId: adapter.packageId,
builder: adapter.name,
itemTypeId,
price,
adapterName: adapter.name,
} as MarketListing
})
)
return results
.filter((r): r is MarketListing => r !== null)
.sort((a, b) => a.price - b.price) // Sort by price ascending
}
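One practical hardening for aggregatePrices: a slow or unreachable market RPC should not stall the whole comparison. A sketch of a per-adapter timeout wrapper (the helper name and the suggested 3-second budget are illustrative, not part of the example code):

```typescript
// Resolve to `null` instead of hanging when a single market RPC is slow or throws.
export function withTimeout<T>(p: Promise<T>, ms: number): Promise<T | null> {
  return new Promise((resolve) => {
    const timer = setTimeout(() => resolve(null), ms)
    p.then(
      (v) => { clearTimeout(timer); resolve(v) },
      () => { clearTimeout(timer); resolve(null) }, // treat errors like a missing quote
    )
  })
}

// Inside aggregatePrices, wrap each adapter call:
//   const price = await withTimeout(adapter.getPrice(client, itemTypeId), 3_000)
```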
Part 2: Aggregated Purchase dApp
// src/AggregatedMarket.tsx
import { useState, useEffect } from 'react'
import { useConnection } from '@evefrontier/dapp-kit'
import { useDAppKit, useCurrentClient } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
import { aggregatePrices, MyMarketAdapter, BobMarketAdapter, AliceMarketAdapter, MarketListing } from '../lib/marketAdapters'
const ADAPTERS_MAP = {
[AliceMarketAdapter.packageId]: AliceMarketAdapter,
[BobMarketAdapter.packageId]: BobMarketAdapter,
[MyMarketAdapter.packageId]: MyMarketAdapter,
}
const ITEM_TYPES = [
{ id: 101, name: 'Rare Ore' },
{ id: 102, name: 'Shield Module' },
{ id: 103, name: 'Thruster' },
]
export function AggregatedMarket() {
const { isConnected, handleConnect } = useConnection()
const client = useCurrentClient()
const dAppKit = useDAppKit()
const [selectedItem, setSelectedItem] = useState<number | null>(null)
const [listings, setListings] = useState<MarketListing[]>([])
const [loading, setLoading] = useState(false)
const [status, setStatus] = useState('')
const searchListings = async (itemTypeId: number) => {
setSelectedItem(itemTypeId)
setLoading(true)
try {
const results = await aggregatePrices(client, itemTypeId)
setListings(results)
} finally {
setLoading(false)
}
}
const buyFromMarket = async (listing: MarketListing) => {
if (!isConnected) { setStatus('Please connect wallet first'); return }
setStatus('⏳ Building transaction...')
const tx = new Transaction()
const priceMist = BigInt(Math.ceil(listing.price * 1e9))
const [paymentCoin] = tx.splitCoins(tx.gas, [tx.pure.u64(priceMist)])
const adapter = ADAPTERS_MAP[listing.marketId]
if (!adapter) { setStatus('❌ No adapter registered for this market'); return }
adapter.buildBuyTx(tx, listing.itemTypeId, 'CHARACTER_ID', paymentCoin)
try {
const result = await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Purchase successful! Tx: ${result.digest.slice(0, 12)}...`)
searchListings(listing.itemTypeId) // Refresh
} catch (e: any) {
setStatus(`❌ ${e.message}`)
}
}
return (
<div className="aggregated-market">
<header>
<h1>🛒 Cross-Market Aggregator</h1>
<p>Compare prices across Builder markets in real time and buy at the lowest price with one click</p>
{!isConnected && <button onClick={handleConnect}>Connect Wallet</button>}
</header>
{/* Item Selection */}
<div className="item-selector">
{ITEM_TYPES.map(item => (
<button
key={item.id}
className={`item-btn ${selectedItem === item.id ? 'selected' : ''}`}
onClick={() => searchListings(item.id)}
>
{item.name}
</button>
))}
</div>
{/* Price List */}
{loading && <div className="loading">🔍 Querying market prices...</div>}
{!loading && listings.length > 0 && (
<div className="listings">
<h3>
{ITEM_TYPES.find(i => i.id === selectedItem)?.name} — Price Comparison
<span className="badge">Lowest Price First</span>
</h3>
{listings.map((listing, i) => (
<div
key={listing.marketId}
className={`listing-row ${i === 0 ? 'best-price' : ''}`}
>
<span className="rank">#{i + 1}</span>
<span className="builder">{listing.builder}</span>
<span className="price">
{listing.price.toFixed(2)} SUI
{i === 0 && <span className="best-badge">Lowest</span>}
</span>
<button
className="buy-btn"
onClick={() => buyFromMarket(listing)}
disabled={!isConnected}
>
Buy Now
</button>
</div>
))}
</div>
)}
{!loading && listings.length === 0 && selectedItem && (
<div className="empty">No listings for this item in any market</div>
)}
{status && <div className="status">{status}</div>}
</div>
)
}
🎯 Complete Review
Architecture Layers
├── Contract Layer: Each Builder publishes separately, different interfaces
│ ├── Alice: buy_item(market, char, item_id, coin)
│ ├── Bob: purchase(storage, char, type_id, payment, ctx)
│ └── You: buy_item_v2(market, char, id, coin, clock, ctx)
│
├── Adapter Layer (TypeScript, off-chain)
│ ├── MarketAdapter interface unified
│ ├── AliceMarketAdapter: Encapsulates Alice's read/write differences
│ ├── BobMarketAdapter: Encapsulates Bob's read/write differences
│ └── MyMarketAdapter: Encapsulates your own read/write
│
└── Aggregator dApp Layer
├── aggregatePrices(): Parallel read all markets
├── Sort and display
└── buyFromMarket(): Call corresponding adapter to build transaction
🔧 Extension Exercises
- On-Chain Adapter Registry: Maintain a verified adapter list on-chain to stop malicious Builders from gaining trust with fake prices
- Slippage Protection: Re-verify the on-chain price just before ordering and abort if it has moved more than 5%
- Batch Purchase: Buy different items from multiple markets in a single transaction
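The slippage-protection exercise reduces to a small pure check before building the purchase transaction: re-fetch the price through the adapter and compare it with the quote the user clicked on. A sketch (the helper name and the 5% threshold are illustrative):

```typescript
// Returns true when the fresh on-chain price is still within `maxBps` basis
// points of the price the user saw when clicking "Buy".
export function withinSlippage(quotedMist: bigint, freshMist: bigint, maxBps: bigint): boolean {
  if (quotedMist === 0n) return false
  const diff = freshMist > quotedMist ? freshMist - quotedMist : quotedMist - freshMist
  return diff * 10_000n <= quotedMist * maxBps
}

// Usage inside buyFromMarket(), before signing:
//   const fresh = await adapter.getPrice(client, listing.itemTypeId)
//   if (fresh === null || !withinSlippage(priceMist, BigInt(Math.ceil(fresh * 1e9)), 500n)) {
//     setStatus('Price moved too much, please re-quote'); return
//   }
```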
📚 Related Documentation
- Chapter 15: Cross-Contract Composability
- Chapter 9: Off-Chain Indexing
- Chapter 19: Full-Stack dApp Architecture
Practical Case 13: Subscription Pass (Monthly Unlimited Jumps)
Objective: Establish a subscription pass system in which players pay a fixed SUI fee per period for unlimited jumps through your alliance stargate network, with no per-jump ticket purchases.
Status: Teaching example. The main text focuses on the subscription model; the complete directory is based on book/src/code/example-13/.
Corresponding Code Directory
Minimal Call Chain
Select plan -> Pay subscription fee -> Mint/update GatePassNFT -> Stargate verifies pass validity
Requirements Analysis
Scenario: Your alliance controls 5 stargates and wants to establish a monthly membership system:
- Monthly Pass: 30 SUI/month, unlimited jumps through all stargates
- Quarterly Pass: 80 SUI/quarter, with discount
- After expiration, renewal is required; otherwise the player falls back to pay-per-use
- The subscription NFT is transferable (players can trade it on the secondary market)
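The quarterly discount in the requirements is easy to sanity-check by normalizing both plans to a per-day cost (a throwaway helper for the reasoning, not part of the example code):

```typescript
// SUI per day for a plan, so plan value can be compared directly.
export function perDayCost(priceSui: number, durationDays: number): number {
  return priceSui / durationDays
}

// Monthly:   perDayCost(30, 30) = 1.0 SUI/day
// Quarterly: perDayCost(80, 90) ≈ 0.89 SUI/day; 10 SUI cheaper than three monthly passes
```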
Contract
module subscription::gate_pass;
use sui::object::{Self, UID, ID};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::transfer;
use sui::event;
// ── Constants ──────────────────────────────────────────────────
const MONTH_MS: u64 = 30 * 24 * 60 * 60 * 1000;
/// Plan types
const PLAN_MONTHLY: u8 = 0;
const PLAN_QUARTERLY: u8 = 1;
// ── Data Structures ───────────────────────────────────────────────
/// Subscription manager (shared object)
public struct SubscriptionManager has key {
id: UID,
monthly_price: u64, // Monthly plan price (MIST)
quarterly_price: u64, // Quarterly plan price
revenue: Balance<SUI>,
admin: address,
total_subscribers: u64,
}
/// Subscription NFT (transferable, holding grants permission)
public struct GatePassNFT has key, store {
id: UID,
plan: u8,
valid_until_ms: u64,
subscriber: address, // Original subscriber
serial_number: u64,
}
// ── Events ──────────────────────────────────────────────────
public struct PassPurchased has copy, drop {
pass_id: ID,
buyer: address,
plan: u8,
valid_until_ms: u64,
}
public struct PassRenewed has copy, drop {
pass_id: ID,
new_expiry_ms: u64,
}
// ── Initialization ────────────────────────────────────────────────
fun init(ctx: &mut TxContext) {
transfer::share_object(SubscriptionManager {
id: object::new(ctx),
monthly_price: 30_000_000_000, // 30 SUI
quarterly_price: 80_000_000_000, // 80 SUI (10 SUI cheaper than 3 months)
revenue: balance::zero(),
admin: ctx.sender(),
total_subscribers: 0,
});
}
// ── Purchase Subscription ──────────────────────────────────────────────
public fun purchase_pass(
mgr: &mut SubscriptionManager,
plan: u8,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let (price, duration_ms) = if (plan == PLAN_MONTHLY) {
(mgr.monthly_price, MONTH_MS)
} else if (plan == PLAN_QUARTERLY) {
(mgr.quarterly_price, 3 * MONTH_MS)
} else {
abort EInvalidPlan
};
assert!(coin::value(&payment) >= price, EInsufficientPayment);
let pay = payment.split(price, ctx);
balance::join(&mut mgr.revenue, coin::into_balance(pay));
if (coin::value(&payment) > 0) {
transfer::public_transfer(payment, ctx.sender());
} else { coin::destroy_zero(payment); }
mgr.total_subscribers = mgr.total_subscribers + 1;
let valid_until_ms = clock.timestamp_ms() + duration_ms;
let pass = GatePassNFT {
id: object::new(ctx),
plan,
valid_until_ms,
subscriber: ctx.sender(),
serial_number: mgr.total_subscribers,
};
let pass_id = object::id(&pass);
transfer::public_transfer(pass, ctx.sender());
event::emit(PassPurchased {
pass_id,
buyer: ctx.sender(),
plan,
valid_until_ms,
});
}
/// Renew (extend validity period of existing Pass)
public fun renew_pass(
mgr: &mut SubscriptionManager,
pass: &mut GatePassNFT,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let (price, duration_ms) = if (pass.plan == PLAN_MONTHLY) {
(mgr.monthly_price, MONTH_MS)
} else {
(mgr.quarterly_price, 3 * MONTH_MS)
};
assert!(coin::value(&payment) >= price, EInsufficientPayment);
let pay = payment.split(price, ctx);
balance::join(&mut mgr.revenue, coin::into_balance(pay));
if (coin::value(&payment) > 0) {
transfer::public_transfer(payment, ctx.sender());
} else { coin::destroy_zero(payment); }
// If already expired, restart from now; otherwise extend from the current expiry
let base = if (pass.valid_until_ms < clock.timestamp_ms()) {
clock.timestamp_ms()
} else { pass.valid_until_ms };
pass.valid_until_ms = base + duration_ms;
event::emit(PassRenewed {
pass_id: object::id(pass),
new_expiry_ms: pass.valid_until_ms,
});
}
/// Stargate extension: verify Pass validity
public fun is_pass_valid(pass: &GatePassNFT, clock: &Clock): bool {
clock.timestamp_ms() <= pass.valid_until_ms
}
/// Stargate jump (unlimited jumps with a valid Pass); Gate, Character, and
/// gate::issue_jump_permit come from the game's gate module (imports omitted here)
public fun subscriber_jump(
gate: &Gate,
dest_gate: &Gate,
character: &Character,
pass: &GatePassNFT,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(is_pass_valid(pass, clock), EPassExpired);
gate::issue_jump_permit(
gate, dest_gate, character, SubscriberAuth {},
clock.timestamp_ms() + 30 * 60 * 1000, ctx,
);
}
public struct SubscriberAuth has drop {}
/// Admin withdraw revenue
public fun withdraw_revenue(
mgr: &mut SubscriptionManager,
amount: u64,
ctx: &mut TxContext, // coin::take requires &mut TxContext
) {
assert!(ctx.sender() == mgr.admin, ENotAdmin);
let coin = coin::take(&mut mgr.revenue, amount, ctx);
transfer::public_transfer(coin, mgr.admin);
}
const EInvalidPlan: u64 = 0;
const EInsufficientPayment: u64 = 1;
const EPassExpired: u64 = 2;
const ENotAdmin: u64 = 3;
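The renewal rule in renew_pass (restart from now if expired, otherwise stack onto the current expiry) is worth mirroring in the frontend, so the UI can preview the new expiry before the player signs. A hypothetical helper:

```typescript
// Mirrors renew_pass: an expired pass restarts from `nowMs`,
// an active pass extends from its current expiry.
export function previewNewExpiry(validUntilMs: number, nowMs: number, durationMs: number): number {
  const base = validUntilMs < nowMs ? nowMs : validUntilMs
  return base + durationMs
}
```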
dApp
// PassShop.tsx
import { useState } from 'react'
import { Transaction } from '@mysten/sui/transactions'
import { useDAppKit } from '@mysten/dapp-kit-react'
const SUB_PKG = "0x_SUBSCRIPTION_PACKAGE_"
const MGR_ID = "0x_MANAGER_ID_"
const PLANS = [
{ id: 0, name: 'Monthly Plan', price: 30, duration: '30 days', badge: 'Standard' },
{ id: 1, name: 'Quarterly Plan', price: 80, duration: '90 days', badge: 'Save 10 SUI', popular: true },
]
export function PassShop() {
const dAppKit = useDAppKit()
const [status, setStatus] = useState('')
const purchase = async (plan: number, priceInSUI: number) => {
const tx = new Transaction()
const [payment] = tx.splitCoins(tx.gas, [tx.pure.u64(priceInSUI * 1e9)])
tx.moveCall({
target: `${SUB_PKG}::gate_pass::purchase_pass`,
arguments: [tx.object(MGR_ID), tx.pure.u8(plan), payment, tx.object('0x6')],
})
try {
setStatus('Purchasing...')
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus('Subscription successful! GatePassNFT sent to your wallet')
} catch (e: any) { setStatus(`${e.message}`) }
}
return (
<div className="pass-shop">
<h1>Stargate Subscription Pass</h1>
<p>Unlimited jumps through all alliance stargates with valid pass</p>
<div className="plan-grid">
{PLANS.map(plan => (
<div key={plan.id} className={`plan-card ${plan.popular ? 'popular' : ''}`}>
{plan.popular && <div className="popular-badge">Recommended</div>}
<h3>{plan.name}</h3>
<div className="plan-price">
<span className="price">{plan.price}</span>
<span className="unit">SUI</span>
</div>
<div className="plan-duration">Valid for {plan.duration}</div>
<div className="plan-badge">{plan.badge}</div>
<button className="buy-btn" onClick={() => purchase(plan.id, plan.price)}>
Purchase {plan.name}
</button>
</div>
))}
</div>
{status && <p className="status">{status}</p>}
</div>
)
}
Related Documentation
Practical Case 14: NFT Collateral Lending Protocol
Objective: Build an on-chain lending protocol in which players pledge NFTs or high-value items as collateral to borrow SUI liquidity; if a loan is not repaid on time, any liquidator can claim the collateral at a discount.
Status: Teaching example. The main text covers the core lending flow; the complete directory is based on book/src/code/example-14/.
Corresponding Code Directory
Minimal Call Chain
Lender injects liquidity -> Borrower collateralizes NFT -> Contract issues SUI -> Repayment on time or liquidation triggered
Requirements Analysis
Scenario: A player holds a “rare shield” worth 1000 SUI but urgently needs SUI to buy mining machines. They pledge the shield, borrow 600 SUI (60% LTV), and must repay 618 SUI within 30 days (3% monthly interest); otherwise the shield can be liquidated.
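The scenario's numbers fall straight out of the contract's bps constants. A TypeScript mirror of the math, useful for previewing terms in the frontend (the helper is illustrative; the contract below remains the source of truth):

```typescript
// Mirrors create_loan's math: 60% LTV and 3% monthly interest, both in basis points.
export function loanTerms(collateralValueSui: number) {
  const LTV_BPS = 6_000
  const MONTHLY_INTEREST_BPS = 300
  const loanAmount = Math.floor(collateralValueSui * LTV_BPS / 10_000)
  const interest = Math.floor(loanAmount * MONTHLY_INTEREST_BPS / 10_000)
  return { loanAmount, interest, totalDue: loanAmount + interest }
}

// loanTerms(1000) → { loanAmount: 600, interest: 18, totalDue: 618 }: the scenario's numbers
```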
Contract
module lending::collateral_loan;
use sui::object::{Self, UID, ID};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::transfer;
use sui::dynamic_field as df;
use sui::event;
// ── Constants ──────────────────────────────────────────────────
const MONTH_MS: u64 = 30 * 24 * 60 * 60 * 1000;
const LTV_BPS: u64 = 6_000; // 60% loan-to-value
const MONTHLY_INTEREST_BPS: u64 = 300; // 3% monthly interest
const LIQUIDATION_BONUS_BPS: u64 = 500; // 5% liquidator reward
// ── Data Structures ───────────────────────────────────────────────
/// Lending pool (shared object, stores lender's SUI)
public struct LendingPool has key {
id: UID,
liquidity: Balance<SUI>,
total_loaned: u64,
admin: address,
}
/// Single loan
public struct Loan has key {
id: UID,
borrower: address,
collateral_id: ID, // Collateral ObjectID
collateral_value: u64, // Evaluated value at lending time (SUI)
loan_amount: u64, // Actual borrowed amount
interest_amount: u64, // Interest due
repay_by_ms: u64, // Repayment deadline
is_active: bool,
}
// ── Events ──────────────────────────────────────────────────
public struct LoanCreated has copy, drop {
loan_id: ID,
borrower: address,
loan_amount: u64,
repay_by_ms: u64,
}
public struct LoanRepaid has copy, drop {
loan_id: ID,
repaid: u64,
}
public struct LoanLiquidated has copy, drop {
loan_id: ID,
liquidator: address,
collateral_id: ID,
}
// ── Initialize Lending Pool ──────────────────────────────────────────
public fun create_pool(ctx: &mut TxContext) {
transfer::share_object(LendingPool {
id: object::new(ctx),
liquidity: balance::zero(),
total_loaned: 0,
admin: ctx.sender(),
});
}
/// Lender deposits liquidity into pool
public fun deposit_liquidity(
pool: &mut LendingPool,
coin: Coin<SUI>,
_ctx: &TxContext,
) {
balance::join(&mut pool.liquidity, coin::into_balance(coin));
}
// ── Borrow (with NFT as collateral) ────────────────────────────────
/// Create a loan against an NFT. The valuation must come from a trusted source:
/// in production, gate this call behind a price oracle or AdminCap, because as
/// written any caller could pass an arbitrary collateral_value_sui
public fun create_loan<T: key + store>(
pool: &mut LendingPool,
collateral: T,
collateral_value_sui: u64, // Valuation confirmed by price oracle or Admin
clock: &Clock,
ctx: &mut TxContext,
) {
let loan_amount = collateral_value_sui * LTV_BPS / 10_000; // 60% LTV
let interest = loan_amount * MONTHLY_INTEREST_BPS / 10_000;
assert!(balance::value(&pool.liquidity) >= loan_amount, EInsufficientLiquidity);
let collateral_id = object::id(&collateral);
let mut loan = Loan {
id: object::new(ctx),
borrower: ctx.sender(),
collateral_id,
collateral_value: collateral_value_sui,
loan_amount,
interest_amount: interest,
repay_by_ms: clock.timestamp_ms() + MONTH_MS,
is_active: true,
};
// Lock collateral in Loan object (dynamic field)
df::add(&mut loan.id, b"collateral", collateral);
// Issue loan
let loan_coin = coin::take(&mut pool.liquidity, loan_amount, ctx);
pool.total_loaned = pool.total_loaned + loan_amount;
transfer::public_transfer(loan_coin, ctx.sender());
event::emit(LoanCreated {
loan_id: object::id(&loan),
borrower: ctx.sender(),
loan_amount,
repay_by_ms: loan.repay_by_ms,
});
transfer::share_object(loan);
}
// ── Repayment (return loan + interest, retrieve collateral) ──────────────────
public fun repay_loan<T: key + store>(
pool: &mut LendingPool,
loan: &mut Loan,
mut repayment: Coin<SUI>,
ctx: &mut TxContext,
) {
assert!(loan.borrower == ctx.sender(), ENotBorrower);
assert!(loan.is_active, ELoanInactive);
let total_due = loan.loan_amount + loan.interest_amount;
assert!(coin::value(&repayment) >= total_due, EInsufficientRepayment);
// Repayment to pool
let repay_coin = repayment.split(total_due, ctx);
balance::join(&mut pool.liquidity, coin::into_balance(repay_coin));
pool.total_loaned = pool.total_loaned - loan.loan_amount;
if (coin::value(&repayment) > 0) {
transfer::public_transfer(repayment, ctx.sender());
} else { coin::destroy_zero(repayment); }
// Retrieve collateral
let collateral: T = df::remove(&mut loan.id, b"collateral");
transfer::public_transfer(collateral, ctx.sender());
loan.is_active = false;
event::emit(LoanRepaid {
loan_id: object::id(loan),
repaid: total_due,
});
}
// ── Liquidation (overdue, liquidator takes collateral) ──────────────────
public fun liquidate<T: key + store>(
pool: &mut LendingPool,
loan: &mut Loan,
mut liquidation_payment: Coin<SUI>, // Liquidator pays collateral_value * 95%
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(loan.is_active, ELoanInactive);
assert!(clock.timestamp_ms() > loan.repay_by_ms, ENotYetExpired);
// Liquidator must pay 95% of collateral valuation (5% as reward)
let liquidation_price = loan.collateral_value * (10_000 - LIQUIDATION_BONUS_BPS) / 10_000;
assert!(coin::value(&liquidation_payment) >= liquidation_price, EInsufficientPayment);
let pay = liquidation_payment.split(liquidation_price, ctx);
// The entire liquidation payment goes to the pool; the liquidator's reward is the 5% discount on the collateral
balance::join(&mut pool.liquidity, coin::into_balance(pay));
if (coin::value(&liquidation_payment) > 0) {
transfer::public_transfer(liquidation_payment, ctx.sender());
} else { coin::destroy_zero(liquidation_payment); }
// Liquidator receives collateral
let collateral: T = df::remove(&mut loan.id, b"collateral");
transfer::public_transfer(collateral, ctx.sender());
loan.is_active = false;
event::emit(LoanLiquidated {
loan_id: object::id(loan),
liquidator: ctx.sender(),
collateral_id: loan.collateral_id,
});
}
const EInsufficientLiquidity: u64 = 0;
const ENotBorrower: u64 = 1;
const ELoanInactive: u64 = 2;
const EInsufficientRepayment: u64 = 3;
const ENotYetExpired: u64 = 4;
const EInsufficientPayment: u64 = 5;
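The liquidation economics follow the same bps pattern: the liquidator pays the collateral valuation minus the 5% bonus, and the whole payment goes to the pool. A hypothetical off-chain mirror of liquidate()'s pricing:

```typescript
// Mirrors liquidate(): price = value * (10_000 - bonusBps) / 10_000.
export function liquidationPrice(collateralValueSui: number, bonusBps = 500): number {
  return Math.floor(collateralValueSui * (10_000 - bonusBps) / 10_000)
}

// For the 1000 SUI shield: liquidationPrice(1000) → 950.
// The pool receives 950 against a 618 SUI claim, so it keeps a surplus;
// the liquidator's profit is the 50 SUI discount on the collateral.
```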
dApp Interface (Lending Dashboard)
// LendingDashboard.tsx
import { useQuery } from '@tanstack/react-query'
import { useCurrentClient } from '@mysten/dapp-kit-react'
const LENDING_PKG = "0x_LENDING_PACKAGE_"
const POOL_ID = "0x_POOL_ID_"
export function LendingDashboard() {
const client = useCurrentClient()
const { data: pool } = useQuery({
queryKey: ['lending-pool'],
queryFn: async () => {
const obj = await client.getObject({ id: POOL_ID, options: { showContent: true } })
return (obj.data?.content as any)?.fields
},
refetchInterval: 15_000,
})
const availableLiquidity = Number(pool?.liquidity?.fields?.value ?? 0) / 1e9
const totalLoaned = Number(pool?.total_loaned ?? 0) / 1e9
const utilization = totalLoaned + availableLiquidity > 0 ? totalLoaned / (availableLiquidity + totalLoaned) * 100 : 0 // avoid 0/0 → NaN on an empty pool
return (
<div className="lending-dashboard">
<h1>NFT Collateral Lending</h1>
<div className="pool-stats">
<div className="stat">
<span>Available Liquidity</span>
<strong>{availableLiquidity.toFixed(2)} SUI</strong>
</div>
<div className="stat">
<span>Total Loaned</span>
<strong>{totalLoaned.toFixed(2)} SUI</strong>
</div>
<div className="stat">
<span>Utilization Rate</span>
<strong>{utilization.toFixed(1)}%</strong>
</div>
<div className="stat">
<span>Monthly Interest</span>
<strong>3%</strong>
</div>
</div>
<div className="loan-info">
<h3>Loan Terms</h3>
<ul>
<li>Loan-to-Value (LTV): 60%</li>
<li>Monthly Interest: 3% (fixed)</li>
<li>Maximum Term: 30 days</li>
<li>Overdue Liquidation: Collateral acquired by liquidator at 95% valuation</li>
</ul>
</div>
</div>
)
}
Related Documentation
Practical Case 16: NFT Crafting and Disassembly System
Objective: Build a material crafting system that destroys multiple low-tier NFTs for a chance to craft one higher-tier NFT, supports disassembling high-tier NFTs back into materials, and uses on-chain randomness to keep results fair.
Status: Teaching example. The main text explains crafting/disassembly and randomness integration; the complete directory is based on book/src/code/example-16/.
Corresponding Code Directory
Minimal Call Chain
User selects materials -> Contract reads random number -> Execute craft/fail return -> Emit event -> Frontend refreshes results
Requirements Analysis
Scenario: You’ve designed a three-tier equipment system:
- Material Fragment: Common, random drops
- Refined Component: 3 fragments → 60% chance to craft
- Ancient Artifact: 3 refined components → 30% chance to craft, returns 1 component on failure
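Before tuning the rates, it helps to check the crafting economy: how many fragments does one Ancient Artifact cost in expectation? With a 60% component rate (no refund) and a 30% artifact rate that refunds one component on failure, the answer solves a small recurrence. A throwaway calculation (not part of the example code):

```typescript
// Expected fragments consumed per Ancient Artifact, given the contract's rates.
export function expectedFragmentsPerArtifact(
  componentRate = 0.6,  // fragment → component, no refund on failure
  artifactRate = 0.3,   // component → artifact, 1 component refunded on failure
): number {
  // Fragments per component: each attempt burns 3 and succeeds with componentRate.
  const fragmentsPerComponent = 3 / componentRate
  // Components per artifact: c = 3 + (1 - p) * (c - 1)  =>  c = (2 + p) / p
  // (each attempt burns 3; a failure refunds 1 and forces another c in expectation)
  const componentsPerArtifact = (2 + artifactRate) / artifactRate
  return fragmentsPerComponent * componentsPerArtifact
}

// expectedFragmentsPerArtifact() ≈ 38.3; tune the bps constants until this feels right.
```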
Contract
module crafting::forge;
use sui::object::{Self, UID};
use sui::random::{Self, Random};
use sui::transfer;
use sui::event;
use std::string::{String, utf8};
// ── Constants ──────────────────────────────────────────────────
const TIER_FRAGMENT: u8 = 0;
const TIER_COMPONENT: u8 = 1;
const TIER_ARTIFACT: u8 = 2;
// Crafting success rates (BPS)
const FRAGMENT_TO_COMPONENT_BPS: u64 = 6_000; // 60%
const COMPONENT_TO_ARTIFACT_BPS: u64 = 3_000; // 30%
// ── Data Structures ───────────────────────────────────────────────
public struct ForgeItem has key, store {
id: UID,
tier: u8,
name: String,
image_url: String,
power: u64, // Attribute value (higher tier = stronger)
}
public struct ForgeAdminCap has key, store { id: UID }
// ── Events ──────────────────────────────────────────────────
public struct CraftAttempted has copy, drop {
crafter: address,
input_tier: u8,
success: bool,
result_tier: u8,
}
public struct ItemDisassembled has copy, drop {
crafter: address,
from_tier: u8,
fragments_returned: u64,
}
// ── Initialization ────────────────────────────────────────────────
fun init(ctx: &mut TxContext) {
transfer::public_transfer(ForgeAdminCap { id: object::new(ctx) }, ctx.sender());
}
/// Mint base fragment (Admin only, e.g., quest reward)
public fun mint_fragment(
_cap: &ForgeAdminCap,
recipient: address,
ctx: &mut TxContext,
) {
let item = ForgeItem {
id: object::new(ctx),
tier: TIER_FRAGMENT,
name: utf8(b"Plasma Fragment"),
image_url: utf8(b"https://assets.example.com/fragment.png"),
power: 10,
};
transfer::public_transfer(item, recipient);
}
// ── Crafting: 3 low-level → 1 high-level (with random success rate) ────────────
// Per Sui guidelines, a function consuming Random should be a non-public `entry`
// function, so callers can't inspect the result and conditionally abort in the same tx
entry fun craft(
input1: ForgeItem,
input2: ForgeItem,
input3: ForgeItem,
random: &Random,
ctx: &mut TxContext,
) {
// Three inputs must be same tier
assert!(input1.tier == input2.tier && input2.tier == input3.tier, EMismatchedTier);
let input_tier = input1.tier;
assert!(input_tier < TIER_ARTIFACT, EMaxTierReached);
let target_tier = input_tier + 1;
// Get on-chain random number (0-9999)
let mut rng = random::new_generator(random, ctx);
let roll = rng.generate_u64() % 10_000;
let success_threshold = if target_tier == TIER_COMPONENT {
FRAGMENT_TO_COMPONENT_BPS
} else {
COMPONENT_TO_ARTIFACT_BPS
};
// Destroy all three inputs regardless of success
let ForgeItem { id: id1, .. } = input1;
let ForgeItem { id: id2, .. } = input2;
let ForgeItem { id: id3, .. } = input3;
id1.delete(); id2.delete(); id3.delete();
let success = roll < success_threshold;
if success {
let (name, image_url, power) = get_tier_info(target_tier);
let result = ForgeItem {
id: object::new(ctx),
tier: target_tier,
name,
image_url,
power,
};
transfer::public_transfer(result, ctx.sender());
} else if target_tier == TIER_ARTIFACT {
// Consolation prize on artifact craft failure: return 1 refined component
let (name, image_url, power) = get_tier_info(TIER_COMPONENT);
let consolation = ForgeItem {
id: object::new(ctx),
tier: TIER_COMPONENT,
name,
image_url,
power,
};
transfer::public_transfer(consolation, ctx.sender());
};
// No refund when a component craft fails; the 60% success rate already prices in the player's risk
event::emit(CraftAttempted {
crafter: ctx.sender(),
input_tier,
success,
result_tier: if success { target_tier } else { input_tier },
});
}
// ── Disassembly: 1 high-level → multiple low-level ────────────────────────────
public fun disassemble(
item: ForgeItem,
ctx: &mut TxContext,
) {
assert!(item.tier > TIER_FRAGMENT, ECannotDisassembleFragment);
let target_tier = item.tier - 1;
let fragments_to_return = 2u64; // Disassembly is lossy: returns only 2 of the 3 inputs
let item_tier = item.tier;
let ForgeItem { id, .. } = item;
id.delete();
let (name, image_url, power) = get_tier_info(target_tier);
let mut i = 0;
while (i < fragments_to_return) {
let fragment = ForgeItem {
id: object::new(ctx),
tier: target_tier,
name,
image_url,
power,
};
transfer::public_transfer(fragment, ctx.sender());
i = i + 1;
};
event::emit(ItemDisassembled {
crafter: ctx.sender(),
from_tier: item_tier,
fragments_returned: fragments_to_return,
});
}
fun get_tier_info(tier: u8): (String, String, u64) {
if (tier == TIER_FRAGMENT) {
(utf8(b"Plasma Fragment"), utf8(b"https://assets.example.com/fragment.png"), 10)
} else if (tier == TIER_COMPONENT) {
(utf8(b"Refined Component"), utf8(b"https://assets.example.com/component.png"), 100)
} else {
(utf8(b"Ancient Artifact"), utf8(b"https://assets.example.com/artifact.png"), 1000)
}
}
const EMismatchedTier: u64 = 0;
const EMaxTierReached: u64 = 1;
const ECannotDisassembleFragment: u64 = 2;
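The constants above imply concrete material costs that are worth knowing before tuning the BPS values. A quick expected-value sketch in plain TypeScript; the numbers are derived only from the 60%/30% rates and the one-component consolation refund described above:

```typescript
// Expected-cost sketch for the forge economy above. Rates mirror
// FRAGMENT_TO_COMPONENT_BPS (6000 bps = 60%) and
// COMPONENT_TO_ARTIFACT_BPS (3000 bps = 30%); the 1-component
// consolation on a failed artifact craft is included.
const P_COMPONENT = 0.6;
const P_ARTIFACT = 0.3;

// Each component attempt burns 3 fragments regardless of outcome.
const fragmentsPerComponent = 3 / P_COMPONENT; // 5 fragments on average

// Each artifact attempt burns 3 components, but a failure (70%)
// refunds 1 component, so the net cost per attempt is 3 - 0.7 = 2.3.
const netComponentsPerAttempt = 3 - (1 - P_ARTIFACT);
const componentsPerArtifact = netComponentsPerAttempt / P_ARTIFACT; // ~7.67

const fragmentsPerArtifact = componentsPerArtifact * fragmentsPerComponent; // ~38.3

console.log(fragmentsPerComponent, componentsPerArtifact.toFixed(2), fragmentsPerArtifact.toFixed(2));
```

So one Ancient Artifact costs roughly 38 fragments in expectation; that single number tells you how aggressive the sink is before you change any constant.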
dApp (Forging Station Interface)
// ForgingStation.tsx
import { useState } from 'react'
import { useCurrentClient, useCurrentAccount, useDAppKit } from '@mysten/dapp-kit-react'
import { useQuery } from '@tanstack/react-query'
import { Transaction } from '@mysten/sui/transactions'
const CRAFTING_PKG = "0x_CRAFTING_PACKAGE_"
const TIER_NAMES = ['Fragment', 'Refined Component', 'Ancient Artifact']
const CRAFT_RATES = ['60%', '30%', '—']
export function ForgingStation() {
const client = useCurrentClient()
const dAppKit = useDAppKit()
const account = useCurrentAccount()
const [selected, setSelected] = useState<string[]>([])
const [status, setStatus] = useState('')
const [lastCraft, setLastCraft] = useState<{success: boolean; tier: string} | null>(null)
const { data: userItems, refetch } = useQuery({
queryKey: ['forge-items', account?.address],
queryFn: async () => {
if (!account) return []
const objs = await client.getOwnedObjects({
owner: account.address,
filter: { StructType: `${CRAFTING_PKG}::forge::ForgeItem` },
options: { showContent: true },
})
return objs.data.map(obj => ({
id: obj.data!.objectId,
tier: Number((obj.data!.content as any).fields.tier),
name: (obj.data!.content as any).fields.name,
power: (obj.data!.content as any).fields.power,
}))
},
enabled: !!account,
})
const toggleSelect = (id: string) => {
setSelected(prev => {
if (prev.includes(id)) return prev.filter(i => i !== id)
if (prev.length >= 3) return prev
// Only allow same-tier selections; the contract aborts with EMismatchedTier otherwise
const firstTier = userItems?.find(i => i.id === prev[0])?.tier
const thisTier = userItems?.find(i => i.id === id)?.tier
if (prev.length > 0 && firstTier !== thisTier) return prev
return [...prev, id]
})
}
const handleCraft = async () => {
if (selected.length !== 3) return
const tx = new Transaction()
tx.moveCall({
target: `${CRAFTING_PKG}::forge::craft`,
arguments: [
tx.object(selected[0]),
tx.object(selected[1]),
tx.object(selected[2]),
tx.object('0x8'), // Random system object
],
})
try {
setStatus('Crafting (on-chain random determination)...')
const result = await dAppKit.signAndExecuteTransaction({ transaction: tx })
// Read craft result from event
const craftEvent = result.events?.find(e => e.type.includes('CraftAttempted'))
if (craftEvent) {
const { success, result_tier } = craftEvent.parsedJson as any
setLastCraft({ success, tier: TIER_NAMES[Number(result_tier)] })
setStatus(success ? `Craft successful! Obtained ${TIER_NAMES[Number(result_tier)]}` : 'Craft failed')
}
setSelected([])
refetch()
} catch (e: any) { setStatus(`${e.message}`) }
}
const selectedTier = selected.length > 0 && userItems
? userItems.find(i => i.id === selected[0])?.tier
: null
return (
<div className="forging-station">
<h1>Mysterious Forge</h1>
{lastCraft && (
<div className={`craft-result ${lastCraft.success ? 'success' : 'fail'}`}>
{lastCraft.success ? 'Craft Successful!' : 'Craft Failed'} → {lastCraft.tier}
</div>
)}
<div className="craft-info">
<div>Fragment × 3 → Refined Component (success rate {CRAFT_RATES[0]})</div>
<div>Refined Component × 3 → Ancient Artifact (success rate {CRAFT_RATES[1]})</div>
</div>
<h3>Select 3 same-tier items to craft</h3>
<div className="items-grid">
{userItems?.map(item => (
<div
key={item.id}
className={`item-slot ${selected.includes(item.id) ? 'selected' : ''}`}
onClick={() => toggleSelect(item.id)}
>
<div className="tier-badge">{TIER_NAMES[item.tier]}</div>
<div className="power">Power: {item.power}</div>
</div>
))}
</div>
<button
className="craft-btn"
disabled={selected.length !== 3}
onClick={handleCraft}
>
Start Crafting ({selected.length}/3 selected)
</button>
{status && <p className="status">{status}</p>}
</div>
)
}
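The catch block above surfaces raw SDK error strings to the player. A small mapping from the module's abort codes (`EMismatchedTier` = 0, and so on) to readable messages improves that; note the exact `MoveAbort` string format varies by SDK version, so the parsing below is a best-effort illustration, not a guaranteed API:

```typescript
// Hypothetical mapping from the forge module's abort codes to
// user-facing messages; the regex parsing is best-effort because the
// exact MoveAbort string format depends on the SDK version.
const FORGE_ABORT_MESSAGES: Record<number, string> = {
  0: 'Selected items are not all the same tier',    // EMismatchedTier
  1: 'Ancient Artifacts cannot be crafted further', // EMaxTierReached
  2: 'Fragments cannot be disassembled',            // ECannotDisassembleFragment
};

// Extracts an abort code from a message like "MoveAbort(..., 2)";
// returns null when no code can be found.
function explainForgeError(message: string): string | null {
  const match = message.match(/MoveAbort.*?,\s*(\d+)\)/);
  if (!match) return null;
  return FORGE_ABORT_MESSAGES[Number(match[1])] ?? `Unknown forge error ${match[1]}`;
}
```

In the component, `setStatus(explainForgeError(e.message) ?? e.message)` would show the friendly text whenever a code is recognized.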
Related Documentation
Practical Case 18: Inter-Alliance Diplomatic Treaty (Ceasefire and Resource Treaties)
Objective: Build an on-chain diplomatic contract: two alliances can sign treaties (ceasefire, resource sharing, trade agreements); a treaty takes effect once co-signed by both Leaders; violations can be proven on-chain; and enforcement is mandatory during the validity period.
Status: Teaching example. The main text covers the treaty state machine; the complete directory is based on
book/src/code/example-18/.
Corresponding Code Directory
Minimal Call Chain
One party initiates proposal -> Both parties deposit and sign -> Treaty takes effect -> Violation/termination occurs -> Penalty deduction or deposit refund
Test Loop
- Initiate proposal: confirm the `TreatyProposal` is created successfully and the event is emitted
- Co-signing takes effect: confirm `effective_at_ms` is written and both deposits are equal
- Advance notice and termination: confirm termination fails before the notice matures, and deposits are refunded after maturity
- Report violation: confirm the penalty is deducted from the violating party's deposit and transferred to the other party
Requirements Analysis
Scenario: A conflict has erupted between Alliance Alpha and Alliance Beta, and both sides decide to negotiate:
- Ceasefire Agreement: Both alliance turrets do not fire on opposing members for 72 hours
- Passage Agreement: Alpha members can use Beta’s stargates for free (and vice versa)
- Resource Sharing: Both sides transfer 100 WAR Token to each other daily
- Either party can unilaterally terminate the treaty (requires a 24-hour advance notice on-chain)
- Violations (such as illegal turret firing) can be reported via server signature, penalty deposit confiscated
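The minimal call chain above can be sketched as a tiny state machine. The state names below are our own shorthand for this course, not contract identifiers; note that a reported breach deducts a fine but does not change the treaty's state:

```typescript
// Illustrative lifecycle for the treaty described above.
type TreatyState = 'proposed' | 'half_signed' | 'active' | 'notice_given' | 'terminated';

const TRANSITIONS: Record<TreatyState, TreatyState[]> = {
  proposed: ['half_signed'],    // proposer deposits and signs
  half_signed: ['active'],      // counterparty countersigns
  active: ['notice_given'],     // either party gives 24h notice
  notice_given: ['terminated'], // notice matures, deposits refunded
  terminated: [],
};

function canTransition(from: TreatyState, to: TreatyState): boolean {
  return TRANSITIONS[from].includes(to);
}
```

Writing the lifecycle down like this makes it easy to check that every contract entry point maps to exactly one legal transition.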
Contract
module diplomacy::treaty;
use sui::object::{Self, UID, ID};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::transfer;
use sui::event;
use std::string::{Self, String, utf8};
// ── Constants ──────────────────────────────────────────────────
const NOTICE_PERIOD_MS: u64 = 24 * 60 * 60 * 1000; // 24-hour advance termination notice
const BREACH_FINE: u64 = 100_000_000_000; // 100 SUI violation fine (from deposit)
// Treaty types
const TREATY_CEASEFIRE: u8 = 0; // Ceasefire agreement
const TREATY_PASSAGE: u8 = 1; // Passage rights agreement
const TREATY_RESOURCE_SHARE: u8 = 2; // Resource sharing
// ── Data Structures ───────────────────────────────────────────────
/// Diplomatic treaty (shared object)
public struct Treaty has key {
id: UID,
treaty_type: u8,
party_a: address, // Alliance A's Leader address
party_b: address, // Alliance B's Leader address
party_a_signed: bool,
party_b_signed: bool,
effective_at_ms: u64, // Effective time (after co-signing)
expires_at_ms: u64, // Expiration time (0 = indefinite)
termination_notice_ms: u64, // Termination notice time (0 = not notified)
party_a_deposit: Balance<SUI>, // Party A deposit (for violation compensation)
party_b_deposit: Balance<SUI>, // Party B deposit
breach_count_a: u64,
breach_count_b: u64,
description: String,
}
/// Treaty proposal (initiated by one party, awaiting other party's signature)
public struct TreatyProposal has key {
id: UID,
proposed_by: address,
counterparty: address,
treaty_type: u8,
duration_days: u64, // Duration (days), 0 = indefinite
deposit_required: u64, // Required deposit from each party
description: String,
}
// ── Events ──────────────────────────────────────────────────
public struct TreatyProposed has copy, drop { proposal_id: ID, proposer: address, counterparty: address }
public struct TreatySigned has copy, drop { treaty_id: ID, party: address }
public struct TreatyEffective has copy, drop { treaty_id: ID, treaty_type: u8 }
public struct TreatyTerminated has copy, drop { treaty_id: ID, terminated_by: address }
public struct BreachReported has copy, drop { treaty_id: ID, breaching_party: address, fine: u64 }
// ── Initiate Treaty Proposal ──────────────────────────────────────────
public fun propose_treaty(
counterparty: address,
treaty_type: u8,
duration_days: u64,
deposit_required: u64,
description: vector<u8>,
ctx: &mut TxContext,
) {
let proposal = TreatyProposal {
id: object::new(ctx),
proposed_by: ctx.sender(),
counterparty,
treaty_type,
duration_days,
deposit_required,
description: utf8(description),
};
let proposal_id = object::id(&proposal);
transfer::share_object(proposal);
event::emit(TreatyProposed {
proposal_id,
proposer: ctx.sender(),
counterparty,
});
}
// ── Proposer Signs and Deposits (creates the Treaty from the proposal) ────────
public fun accept_and_sign_a(
proposal: &TreatyProposal,
mut deposit: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(ctx.sender() == proposal.proposed_by, ENotParty);
let deposit_amt = coin::value(&deposit);
assert!(deposit_amt >= proposal.deposit_required, EInsufficientDeposit);
let deposit_coin = deposit.split(proposal.deposit_required, ctx);
if (coin::value(&deposit) > 0) {
transfer::public_transfer(deposit, ctx.sender());
} else { coin::destroy_zero(deposit); }
let expires = if (proposal.duration_days > 0) {
clock.timestamp_ms() + proposal.duration_days * 86_400_000
} else { 0 };
let treaty = Treaty {
id: object::new(ctx),
treaty_type: proposal.treaty_type,
party_a: proposal.proposed_by,
party_b: proposal.counterparty,
party_a_signed: true,
party_b_signed: false,
effective_at_ms: 0,
expires_at_ms: expires,
termination_notice_ms: 0,
party_a_deposit: coin::into_balance(deposit_coin),
party_b_deposit: balance::zero(),
breach_count_a: 0,
breach_count_b: 0,
description: proposal.description,
};
let treaty_id = object::id(&treaty);
transfer::share_object(treaty);
event::emit(TreatySigned { treaty_id, party: ctx.sender() });
}
/// Counterparty alliance signs (treaty officially takes effect)
public fun countersign(
treaty: &mut Treaty,
mut deposit: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(ctx.sender() == treaty.party_b, ENotParty);
assert!(treaty.party_a_signed, ENotYetSigned);
assert!(!treaty.party_b_signed, EAlreadySigned);
let required = balance::value(&treaty.party_a_deposit); // Equal deposit
assert!(coin::value(&deposit) >= required, EInsufficientDeposit);
let dep = deposit.split(required, ctx);
balance::join(&mut treaty.party_b_deposit, coin::into_balance(dep));
if (coin::value(&deposit) > 0) {
transfer::public_transfer(deposit, ctx.sender());
} else { coin::destroy_zero(deposit); }
treaty.party_b_signed = true;
treaty.effective_at_ms = clock.timestamp_ms();
event::emit(TreatyEffective { treaty_id: object::id(treaty), treaty_type: treaty.treaty_type });
event::emit(TreatySigned { treaty_id: object::id(treaty), party: ctx.sender() });
}
// ── Verify treaty is in effect (called by turret/stargate extensions) ───────────────
public fun is_treaty_active(treaty: &Treaty, clock: &Clock): bool {
if (!treaty.party_a_signed || !treaty.party_b_signed) { return false };
if (treaty.expires_at_ms > 0 && clock.timestamp_ms() > treaty.expires_at_ms) { return false };
// Treaty still valid during termination notice period
true
}
/// Check if address is protected by treaty
public fun is_protected_by_treaty(
treaty: &Treaty,
_protected_member: address, // Protected alliance member (verified through FactionNFT.owner or member list)
_aggressor_faction: address,
clock: &Clock,
): bool {
is_treaty_active(treaty, clock)
// Real scenario requires additional verification of member-alliance association
}
// ── Submit Termination Notice (takes effect after 24 hours) ───────────────────────
public fun give_termination_notice(
treaty: &mut Treaty,
clock: &Clock,
ctx: &TxContext,
) {
assert!(ctx.sender() == treaty.party_a || ctx.sender() == treaty.party_b, ENotParty);
assert!(is_treaty_active(treaty, clock), ETreatyNotActive);
treaty.termination_notice_ms = clock.timestamp_ms();
event::emit(TreatyTerminated { treaty_id: object::id(treaty), terminated_by: ctx.sender() });
}
/// Officially terminate treaty after notice period matures, both parties retrieve deposits
public fun finalize_termination(
treaty: &mut Treaty,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(treaty.termination_notice_ms > 0, ENoNoticeGiven);
assert!(
clock.timestamp_ms() >= treaty.termination_notice_ms + NOTICE_PERIOD_MS,
ENoticeNotMature,
);
// Refund deposits
let a_dep = balance::withdraw_all(&mut treaty.party_a_deposit);
let b_dep = balance::withdraw_all(&mut treaty.party_b_deposit);
if (balance::value(&a_dep) > 0) {
transfer::public_transfer(coin::from_balance(a_dep, ctx), treaty.party_a);
} else { balance::destroy_zero(a_dep); }
if (balance::value(&b_dep) > 0) {
transfer::public_transfer(coin::from_balance(b_dep, ctx), treaty.party_b);
} else { balance::destroy_zero(b_dep); }
}
// ── Report Violation (verified and signed by game server) ──────────────────
public fun report_breach(
treaty: &mut Treaty,
breaching_party: address, // Violating alliance's Leader address
admin_acl: &AdminACL, // AdminACL / verify_sponsor come from Chapter 8's server-auth module (import omitted)
ctx: &mut TxContext,
) {
verify_sponsor(admin_acl, ctx); // Server proves violation event actually occurred
let fine = BREACH_FINE;
if (breaching_party == treaty.party_a) {
treaty.breach_count_a = treaty.breach_count_a + 1;
// Deduct fine from A's deposit and transfer to B
if (balance::value(&treaty.party_a_deposit) >= fine) {
let fine_coin = coin::take(&mut treaty.party_a_deposit, fine, ctx);
transfer::public_transfer(fine_coin, treaty.party_b);
}
} else if (breaching_party == treaty.party_b) {
treaty.breach_count_b = treaty.breach_count_b + 1;
if (balance::value(&treaty.party_b_deposit) >= fine) {
let fine_coin = coin::take(&mut treaty.party_b_deposit, fine, ctx);
transfer::public_transfer(fine_coin, treaty.party_a);
}
} else { abort ENotParty };
event::emit(BreachReported {
treaty_id: object::id(treaty),
breaching_party,
fine,
});
}
const ENotParty: u64 = 0;
const EInsufficientDeposit: u64 = 1;
const ENotYetSigned: u64 = 2;
const EAlreadySigned: u64 = 3;
const ETreatyNotActive: u64 = 4;
const ENoNoticeGiven: u64 = 5;
const ENoticeNotMature: u64 = 6;
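On the dApp side it helps to mirror `is_treaty_active` and the notice-maturity rule, so buttons can be disabled before a doomed transaction is submitted. A plain-TypeScript sketch; field names match the `Treaty` struct above, and timestamps are milliseconds as on-chain:

```typescript
// Client-side mirror of the contract's is_treaty_active and the
// notice-maturity check, useful for greying out UI actions.
const NOTICE_PERIOD_MS = 24 * 60 * 60 * 1000;

interface TreatyFields {
  party_a_signed: boolean;
  party_b_signed: boolean;
  expires_at_ms: number;         // 0 = indefinite
  termination_notice_ms: number; // 0 = no notice given
}

function isTreatyActive(t: TreatyFields, nowMs: number): boolean {
  if (!t.party_a_signed || !t.party_b_signed) return false;
  if (t.expires_at_ms > 0 && nowMs > t.expires_at_ms) return false;
  return true; // still active during the notice period, like the contract
}

function canFinalizeTermination(t: TreatyFields, nowMs: number): boolean {
  return t.termination_notice_ms > 0 && nowMs >= t.termination_notice_ms + NOTICE_PERIOD_MS;
}
```

Keeping this mirror next to the contract constants makes drift obvious whenever `NOTICE_PERIOD_MS` changes on-chain.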
dApp (Diplomacy Center)
// DiplomacyCenter.tsx
import { useState } from 'react'
import { useCurrentClient } from '@mysten/dapp-kit-react'
import { useQuery } from '@tanstack/react-query'
const DIP_PKG = "0x_DIPLOMACY_PACKAGE_"
const TREATY_TYPES = [
{ id: 0, name: 'Ceasefire Agreement', desc: 'Both parties must not attack during validity period' },
{ id: 1, name: 'Passage Rights Agreement', desc: 'Members can use opposing stargates for free' },
{ id: 2, name: 'Resource Sharing Agreement', desc: 'Periodic mutual resource transfer' },
]
export function DiplomacyCenter() {
const client = useCurrentClient()
const [proposing, setProposing] = useState(false)
const { data: treaties } = useQuery({
queryKey: ['active-treaties'],
queryFn: async () => {
const events = await client.queryEvents({
query: { MoveEventType: `${DIP_PKG}::treaty::TreatyEffective` },
limit: 20,
})
return events.data
},
refetchInterval: 30_000,
})
return (
<div className="diplomacy-center">
<header>
<h1>Inter-Alliance Diplomacy Center</h1>
<p>Sign legally binding alliance treaties on-chain</p>
</header>
<section className="treaty-types">
<h3>Available Treaty Types</h3>
<div className="types-grid">
{TREATY_TYPES.map(t => (
<div key={t.id} className="type-card">
<h4>{t.name}</h4>
<p>{t.desc}</p>
</div>
))}
</div>
</section>
<section className="active-treaties">
<h3>Currently Active Treaties</h3>
{treaties?.length === 0 && <p>No treaties</p>}
{treaties?.map(e => {
const { treaty_id, treaty_type } = e.parsedJson as any
const type = TREATY_TYPES[Number(treaty_type)]
return (
<div key={treaty_id} className="treaty-card">
<span className="treaty-type">{type?.name}</span>
<span className="treaty-id">{treaty_id.slice(0, 12)}...</span>
<span className="treaty-status active">Active</span>
</div>
)
})}
</section>
<button className="propose-btn" onClick={() => setProposing(true)}>
Propose New Treaty
</button>
</div>
)
}
Key Design Highlights
| Mechanism | Implementation |
|---|---|
| Co-signing takes effect | Both party_a_signed + party_b_signed must be true to take effect |
| Deposit constraint | Both disputing parties deposit, violations automatically penalized |
| Termination notice | termination_notice_ms + 24-hour cooling period |
| Violation proof | Game server AdminACL signature proof, auto-execute penalty |
| Treaty verification | is_treaty_active() for turret/stargate extension calls |
Related Documentation
- Chapter 8: AdminACL and Server Verification
- Chapter 11: Ownership and OwnerCap
- Example 12: Alliance Recruitment
Chapter 18: Multi-Tenant Architecture and Game Server Integration
Goal: Understand EVE Frontier’s multi-tenant world contract design, master how to build platform-level contracts serving multiple alliances, and how to bidirectionally integrate with game servers.
Status: Architecture chapter. Main focus on multi-tenant design and Registry patterns.
18.1 What Are Multi-Tenant Contracts?
Single-tenant: One contract serves only one Owner (your alliance).
Multi-tenant: One contract after deployment can simultaneously serve multiple unrelated Owners (multiple alliances), with isolated data.
Single-tenant example (Example 1-5 pattern):
Contract → Dedicated TollGate (only your stargate)
Multi-tenant example:
Contract → Register Alliance A's stargate toll configuration
→ Register Alliance B's stargate toll configuration
→ Register Alliance C's storage box market configuration
→ (Each alliance isolated, data independent)
Use Cases: Building a “SaaS”-level tool that can be used by multiple alliances. Examples: universal auction platform, royalty market infrastructure, quest system framework.
Multi-tenancy is most easily misunderstood as “cramming many users into one contract.” What it really needs to solve is:
How to let many mutually untrusting operators share the same protocol capabilities, but without cross-contamination, unauthorized access, or data pollution.
So the core of multi-tenant design isn't saving on deployments, but three things:
- Isolation: Tenant A cannot touch Tenant B's state
- Reuse: the same logic doesn't need to be repackaged for each alliance
- Operability: the platform can keep maintaining, upgrading, and billing
18.2 Multi-Tenant Contract Design Pattern
module platform::multi_toll;
use sui::table::{Self, Table};
use sui::object::{Self, UID, ID};
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::clock::Clock;
use sui::transfer;
// Gate, OwnerCap, Character, and the gate module come from the EVE Frontier framework (imports omitted)
/// Platform registry (shared object, used by all tenants)
public struct TollPlatform has key {
id: UID,
registrations: Table<ID, TollConfig>, // gate_id → toll configuration
}
/// Each tenant's (stargate's) independent configuration
public struct TollConfig has store {
owner: address, // Owner of this configuration (stargate owner)
toll_amount: u64,
fee_recipient: address,
total_collected: u64,
}
/// Tenant registration (any Builder can register their stargate)
public fun register_gate(
platform: &mut TollPlatform,
gate: &Gate,
owner_cap: &OwnerCap<Gate>, // Prove you're this stargate's Owner
toll_amount: u64,
fee_recipient: address,
ctx: &TxContext,
) {
// Verify OwnerCap and Gate correspondence
assert!(owner_cap.authorized_object_id == object::id(gate), ECapMismatch);
let gate_id = object::id(gate);
assert!(!table::contains(&platform.registrations, gate_id), EAlreadyRegistered);
table::add(&mut platform.registrations, gate_id, TollConfig {
owner: ctx.sender(),
toll_amount,
fee_recipient,
total_collected: 0,
});
}
/// Adjust tenant configuration (only the owner can modify their own configuration)
public fun update_toll(
platform: &mut TollPlatform,
gate: &Gate,
owner_cap: &OwnerCap<Gate>,
new_toll_amount: u64,
ctx: &TxContext,
) {
assert!(owner_cap.authorized_object_id == object::id(gate), ECapMismatch);
let config = table::borrow_mut(&mut platform.registrations, object::id(gate));
assert!(config.owner == ctx.sender(), ENotConfigOwner);
config.toll_amount = new_toll_amount;
}
/// Multi-tenant jump (toll logic reused, but configurations independently isolated)
public fun multi_tenant_jump(
platform: &mut TollPlatform,
source_gate: &Gate,
dest_gate: &Gate,
character: &Character,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
// Read this stargate's dedicated toll configuration
let gate_id = object::id(source_gate);
assert!(table::contains(&platform.registrations, gate_id), EGateNotRegistered);
let config = table::borrow_mut(&mut platform.registrations, gate_id);
assert!(coin::value(&payment) >= config.toll_amount, EInsufficientPayment);
// Transfer to respective fee_recipient
let toll = payment.split(config.toll_amount, ctx);
transfer::public_transfer(toll, config.fee_recipient);
config.total_collected = config.total_collected + config.toll_amount;
// Return change
if coin::value(&payment) > 0 {
transfer::public_transfer(payment, ctx.sender());
} else {
coin::destroy_zero(payment);
};
// Issue jump permit
gate::issue_jump_permit(
source_gate, dest_gate, character, MultiTollAuth {}, clock.timestamp_ms() + 15 * 60 * 1000, ctx,
);
}
public struct MultiTollAuth has drop {}
const ECapMismatch: u64 = 0;
const EAlreadyRegistered: u64 = 1;
const ENotConfigOwner: u64 = 2;
const EGateNotRegistered: u64 = 3;
const EInsufficientPayment: u64 = 4;
What Multi-Tenant Design Really Needs to Decide First Is “What Is the Tenant Key”
In this example, gate_id serves as the tenant boundary. In reality, common tenant keys include:
- A certain `assembly_id`
- A certain `character_id`
- An alliance object ID
- A normalized business primary key
This choice is critical, because it determines:
- How data is isolated
- How permissions are validated
- How frontend and indexing layers retrieve
If the tenant key is chosen poorly, you'll frequently run into messy boundary questions like "is this one tenant or two?"
The Three Most Common Accidents in Multi-Tenant Contracts
1. Incomplete Isolation
It looks multi-tenant, but some paths still use globally shared parameters, so different alliances end up affecting each other.
2. Platform Parameters and Tenant Parameters Mixed Together
The result:
- Configuration that should be globally uniform gets changed privately by one tenant
Or the reverse:
- Fee rates that should be independent per tenant become a single global value
3. Query Model Didn’t Keep Up
The chain stores a multi-tenant structure, but the frontend and indexing layers still read with a single-object mindset, and the platform ends up unusable.
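Keeping the query model in step mostly means reading by tenant key. A minimal sketch of indexing registration events by `gate_id`; the event shape here is an assumption for illustration:

```typescript
// Sketch: index registration events by tenant key (gate_id) so the
// frontend can read per-tenant instead of per-object.
interface RegistrationEvent { gate_id: string; owner: string; toll_amount: number }

function indexByTenant(events: RegistrationEvent[]): Map<string, RegistrationEvent> {
  const byGate = new Map<string, RegistrationEvent>();
  for (const ev of events) {
    byGate.set(ev.gate_id, ev); // later events overwrite earlier config
  }
  return byGate;
}
```

Whatever tenant key the contract picks, the indexer should key its tables by the same value, or the "is this one tenant or two" problem reappears off-chain.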
18.3 Game Server Integration Patterns
Pattern One: Server as Event Listener
// game-server/event-listener.ts
// Game server listens to on-chain events, updates game state
import { SuiClient } from "@mysten/sui/client";
const client = new SuiClient({ url: process.env.SUI_RPC! });
// Listen to player achievements, trigger in-game rewards
await client.subscribeEvent({
filter: { Package: MY_PACKAGE },
onMessage: async (event) => {
if (event.type.includes("AchievementUnlocked")) {
const { player, achievement_type } = event.parsedJson as any;
// Game server handles: grant in-game items to player
await gameServerAPI.grantItemToPlayer(player, achievement_type);
}
if (event.type.includes("GateJumped")) {
const { character_id, destination_gate_id } = event.parsedJson as any;
// Game server handles: teleport player to destination system
await gameServerAPI.teleportCharacter(character_id, destination_gate_id);
}
},
});
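`subscribeEvent` depends on a long-lived websocket; many deployments poll `queryEvents` with a cursor instead. A transport-agnostic sketch with the page fetcher injected — with `@mysten/sui` it would wrap `client.queryEvents({ query, cursor })`:

```typescript
// Cursor-based polling fallback (sketch). The page-fetching function
// is injected so the loop stays testable and client-agnostic.
interface EventPage<T, C> { data: T[]; nextCursor: C | null; hasNextPage: boolean }

async function drainNewEvents<T, C>(
  fetchPage: (cursor: C | null) => Promise<EventPage<T, C>>,
  startCursor: C | null,
  onEvent: (e: T) => void,
): Promise<C | null> {
  let cursor = startCursor;
  // Keep paging until the node reports no further pages, then return
  // the cursor to persist for the next polling tick.
  for (;;) {
    const page = await fetchPage(cursor);
    page.data.forEach(onEvent);
    cursor = page.nextCursor ?? cursor;
    if (!page.hasNextPage) return cursor;
  }
}
```

Persisting the returned cursor between ticks is what makes the listener resumable after a server restart.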
Pattern Two: Server as Data Provider
// game-server/api.ts
// Game server provides off-chain data, dApp calls
import express from "express";
const app = express();
// Provide star system name (decrypt location hash)
app.get("/api/location/:hash", async (req, res) => {
const { hash } = req.params;
const geoInfo = await locationDB.getByHash(hash);
res.json(geoInfo);
});
// Verify proximity (for Sponsor service to call)
app.post("/api/proximity/verify", async (req, res) => {
const { player_id, assembly_id, max_distance_km } = req.body;
const playerPos = await getPlayerPosition(player_id);
const assemblyPos = await getAssemblyPosition(assembly_id);
const distance = calculateDistance(playerPos, assemblyPos);
res.json({
is_near: distance <= max_distance_km,
distance_km: distance,
});
});
// Get player real-time game status
app.get("/api/character/:id/status", async (req, res) => {
const status = await gameServerAPI.getCharacterStatus(req.params.id);
res.json({
online: status.online,
system: status.current_system,
ship: status.current_ship,
fleet: status.fleet_id,
});
});
Pattern Three: Bidirectional State Synchronization
On-chain events ──────────────► Game server
(NFT minting, quest completion) (Update game world state)
Game server ──────────────► On-chain transactions
(Physics verification, sponsor signatures) (Record results, grant rewards)
Don't Mix These Three Patterns Into One Service
Though all three are called "server integration," their responsibilities are completely different:
- Event Listener: consumer-oriented; syncs on-chain results back to the game world
- Data Provider: query-oriented; provides an off-chain interpretation layer for frontend and backend
- Bidirectional Sync: collaboration-oriented; lets the chain and the game server mutually drive state changes
If you don't layer them, you'll easily end up with:
- One service handling listening, sponsoring, and all queries
- No idea which link in the chain broke when something goes wrong
What Matters Most Between the Game Server and the Chain Isn't "Connectivity" but "Mapping Consistency"
For example:
- Is the on-chain `assembly_id` the same thing as the facility ID the game server recognizes?
- Do location hashes correspond one-to-one with off-chain map coordinates?
- Are character IDs in events stably mapped to character primary keys in the game database?
Once these mappings drift, the system still appears to work, but the business data slowly becomes distorted.
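A cheap way to catch drift early is a periodic reconciliation job that checks the mapping is still one-to-one. A sketch over hypothetical (assembly_id, facility_id) pairs:

```typescript
// Reconciliation sketch: verify the on-chain assembly_id → server
// facility-ID mapping is one-to-one before trusting it.
function findMappingDrift(pairs: Array<[string, string]>): string[] {
  const byChain = new Map<string, string>();
  const byServer = new Map<string, string>();
  const problems: string[] = [];
  for (const [assemblyId, facilityId] of pairs) {
    const prevFacility = byChain.get(assemblyId);
    if (prevFacility !== undefined && prevFacility !== facilityId) {
      problems.push(`assembly ${assemblyId} maps to two facilities`);
    }
    const prevAssembly = byServer.get(facilityId);
    if (prevAssembly !== undefined && prevAssembly !== assemblyId) {
      problems.push(`facility ${facilityId} maps to two assemblies`);
    }
    byChain.set(assemblyId, facilityId);
    byServer.set(facilityId, assemblyId);
  }
  return problems;
}
```

Running this against an export of both databases turns "business slowly becomes distorted" into an alert the day the first duplicate appears.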
18.4 ObjectRegistry: Global Query Table
When your contract has multiple shared objects, you need a registry so other contracts and dApps can find them:
module platform::registry;
use sui::table::{Self, Table};
use std::string::String;
// AdminCap comes from the platform's admin module (import omitted)
/// Global registry (like DNS)
public struct ObjectRegistry has key {
id: UID,
entries: Table<String, ID>, // name → ObjectID
}
/// Register a named object
public fun register(
registry: &mut ObjectRegistry,
name: vector<u8>,
object_id: ID,
_admin_cap: &AdminCap,
ctx: &TxContext,
) {
table::add(
&mut registry.entries,
std::string::utf8(name),
object_id,
);
}
/// Query
public fun resolve(registry: &ObjectRegistry, name: String): ID {
*table::borrow(&registry.entries, name)
}
// TypeScript (frontend): query the Treasury ID through the registry
const registry = await getObjectWithJson(REGISTRY_ID);
const treasuryId = registry?.entries?.["alliance_treasury"];
The registry's value isn't just conveniently looking up an ID; it unifies otherwise scattered object-discovery logic.
This will directly improve three things:
- Frontend doesn’t need to hardcode a bunch of object addresses
- Other contracts know where to find key objects
- After upgrades or migrations, the registry enables smooth transitions
But Registry Also Has Boundaries
Don’t treat it as a universal database. It’s best suited for:
- Name resolution
- Core object entry discovery
- Small amount of stable mappings
Not suited for:
- High-frequency changing large lists
- Heavy business statistics
- Large-scale time series data
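Because registry entries are few and stable, the client side can safely memoize resolutions. A sketch with the actual resolution function injected (it would read the `ObjectRegistry` object or call the contract's `resolve`):

```typescript
// Client-side resolver cache over the ObjectRegistry: names change
// rarely, so resolved IDs are memoized after the first lookup.
function makeCachedResolver(resolve: (name: string) => Promise<string>) {
  const cache = new Map<string, string>();
  return async (name: string): Promise<string> => {
    const hit = cache.get(name);
    if (hit !== undefined) return hit;
    const id = await resolve(name);
    cache.set(name, id);
    return id;
  };
}
```

If you migrate an object and re-register the name, remember to clear this cache; that is exactly the "small amount of stable mappings" boundary the registry is meant for.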
🔖 Chapter Summary
| Knowledge Point | Core Takeaway |
|---|---|
| Multi-Tenant Contracts | Table isolates configuration by gate_id, any Builder can register |
| Server Roles | Event listening + data providing + proximity verification |
| Bidirectional Sync | On-chain events → Game state; Game verification → On-chain record |
| ObjectRegistry | Global name table, convenient for other contracts and dApps to find objects |
📚 Further Reading
Chapter 19: Full-Stack dApp Architecture Design
Goal: Design and implement production-grade EVE Frontier dApps, covering state management, real-time data updates, error handling, responsive design, and CI/CD automated deployment.
Status: Architecture chapter. Main focus on full-stack dApp organization, state management, and deployment.
19.1 Full-Stack Architecture Overview
┌─────────────────────────────────────────────────────┐
│ User Browser │
│ ┌──────────────────────────────────────────────┐ │
│ │ React / Next.js dApp │ │
│ │ ┌──────────┐ ┌──────────┐ ┌────────────┐ │ │
│ │ │ EVE Vault│ │React │ │ Tanstack │ │ │
│ │ │ Wallet │ │ dapp-kit │ │ Query │ │ │
│ │ └──────────┘ └──────────┘ └────────────┘ │ │
│ └──────────────────────────────────────────────┘ │
└─────────────────────┬───────────────────────────────┘
│
┌───────────┼────────────┐
▼ ▼ ▼
Sui Full Node Your Backend Game Server
GraphQL Sponsor Svc Location/Verify API
Event Stream Index Svc
What this diagram should convey most isn’t “there are many tech stacks,” but:
A truly usable EVE dApp is never a single-page frontend, but an entire layered collaborative system.
Each layer in this system solves different problems:
- Browser handles interaction and state feedback
- Wallet handles signing and identity
- Full node and GraphQL provide on-chain truth
- Backend handles sponsoring, risk control, aggregation
- Game server provides off-chain world interpretation and verification
If these responsibilities aren't layered, the system can appear to run, but it becomes increasingly difficult to maintain.
19.2 Project Structure (Next.js Example)
dapp/
├── app/ # Next.js App Router
│ ├── layout.tsx # Global layout (Provider)
│ ├── page.tsx # Homepage
│ ├── gate/[id]/page.tsx # Stargate detail page
│ └── dashboard/page.tsx # Management panel
├── components/
│ ├── common/
│ │ ├── WalletButton.tsx
│ │ ├── TxStatus.tsx
│ │ └── LoadingSpinner.tsx
│ ├── gate/
│ │ ├── GateCard.tsx
│ │ ├── JumpPanel.tsx
│ │ └── TollInfo.tsx
│ └── market/
│ ├── ItemGrid.tsx
│ └── BuyButton.tsx
├── hooks/
│ ├── useGate.ts # Stargate data
│ ├── useMarket.ts # Market data
│ ├── useSponsoredAction.ts # Sponsored transactions
│ └── useEvents.ts # Real-time events
├── lib/
│ ├── sui.ts # SuiClient instance
│ ├── contracts.ts # Contract constants
│ ├── queries.ts # GraphQL queries
│ └── config.ts # Environment config
├── store/
│ └── useAppStore.ts # Zustand global state
└── .env.local
The Real Purpose of Directory Structure Isn’t “Looking Good,” But Preventing Responsibility Sprawl
The most common ways to lose control:
- Components issue on-chain requests directly
- Hooks hard-code business rules
- Pages assemble transaction details themselves
- The global store absorbs all state
It runs in the short term, but becomes very hard to change in the long term.
A more stable boundary is usually:
- `components/` handles display and interaction
- `hooks/` handles page-level data flow
- `lib/` handles the underlying client and query encapsulation
- `store/` holds only truly cross-page shared local UI state
19.3 Global Provider Configuration
// app/layout.tsx
"use client";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { SuiClientProvider, WalletProvider } from "@mysten/dapp-kit-react";
import { EveFrontierProvider } from "@evefrontier/dapp-kit";
import { getFullnodeUrl } from "@mysten/sui/client";
import { EVE_VAULT_WALLET } from "@evefrontier/dapp-kit";
const queryClient = new QueryClient({
defaultOptions: {
queries: {
staleTime: 30_000, // Don't re-request within 30 seconds
refetchInterval: false,
retry: 2,
},
},
});
const networks = {
testnet: { url: getFullnodeUrl("testnet") },
mainnet: { url: getFullnodeUrl("mainnet") },
};
export default function RootLayout({ children }: { children: React.ReactNode }) {
return (
<html lang="en">
<body>
<QueryClientProvider client={queryClient}>
<SuiClientProvider networks={networks} defaultNetwork="testnet">
<WalletProvider wallets={[EVE_VAULT_WALLET]} autoConnect>
<EveFrontierProvider>
{children}
</EveFrontierProvider>
</WalletProvider>
</SuiClientProvider>
</QueryClientProvider>
</body>
</html>
);
}
The Provider Chain Declares the Entire App's Runtime Dependency Order
This isn't a formality. Once the order is wrong, common consequences include:
- Wallet context can’t get client
- Query cache invalidation doesn’t work as expected
- dapp-kit can’t read needed environment
So keep the global Provider as stable as possible; avoid churning it during business iteration.
19.4 State Management (Zustand + React Query)
// store/useAppStore.ts
import { create } from "zustand";
interface AppStore {
selectedGateId: string | null;
txPending: boolean;
txDigest: string | null;
setSelectedGate: (id: string | null) => void;
setTxPending: (pending: boolean) => void;
setTxDigest: (digest: string | null) => void;
}
export const useAppStore = create<AppStore>((set) => ({
selectedGateId: null,
txPending: false,
txDigest: null,
setSelectedGate: (id) => set({ selectedGateId: id }),
setTxPending: (pending) => set({ txPending: pending }),
setTxDigest: (digest) => set({ txDigest: digest }),
}));
// hooks/useGate.ts
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { useCurrentClient } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
import { useSponsoredAction } from "./useSponsoredAction"; // local sponsored-tx hook (path assumed)
import { TOLL_PACKAGE } from "../lib/constants"; // published package id (path assumed)
export function useGate(gateId: string) {
const client = useCurrentClient();
return useQuery({
queryKey: ["gate", gateId],
queryFn: async () => {
const obj = await client.getObject({
id: gateId,
options: { showContent: true },
});
return obj.data?.content?.dataType === "moveObject"
? obj.data.content.fields
: null;
},
refetchInterval: 15_000,
});
}
export function useJumpGate(gateId: string) {
const queryClient = useQueryClient();
const { signAndExecuteSponsoredTransaction } = useSponsoredAction();
return useMutation({
mutationFn: async (characterId: string) => {
const tx = new Transaction();
tx.moveCall({
target: `${TOLL_PACKAGE}::toll_gate_ext::pay_toll_and_get_permit`,
arguments: [/* ... */],
});
return signAndExecuteSponsoredTransaction(tx);
},
onSuccess: () => {
// After successful transaction, invalidate related queries (trigger reload)
queryClient.invalidateQueries({ queryKey: ["gate", gateId] });
queryClient.invalidateQueries({ queryKey: ["treasury"] });
},
});
}
React Query and Zustand Don’t Mix Responsibilities
A very practical division of labor:
- React Query: manages on-chain and remote data (cache, invalidation, refetch)
- Zustand: manages local UI state, e.g., the currently selected item, modals, temporary input
Once you stuff on-chain objects into Zustand, or force pure UI state into the Query cache, the codebase will almost certainly become messy later.
A Mature dApp Has At Least Three Layers of State
- Remote truth state: on-chain objects, index results, game server API responses
- Local interaction state: forms, hover, loading, modals
- Transaction state: signing, submitted, confirmed, failed
These three layers update at different rhythms and shouldn’t be mixed into one layer.
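The transaction layer in particular benefits from being made explicit. Below is a minimal sketch (the names `TxState` and `txReducer` are illustrative, not from any SDK) that models it as a small state machine, so invalid transitions are silently ignored rather than corrupting the UI:

```typescript
// Illustrative transaction-layer state machine; not part of any SDK.
type TxState =
  | { phase: "idle" }
  | { phase: "signing" }
  | { phase: "submitted"; digest: string }
  | { phase: "confirmed"; digest: string }
  | { phase: "failed"; error: string };

type TxAction =
  | { type: "SIGN" }
  | { type: "SUBMIT"; digest: string }
  | { type: "CONFIRM" }
  | { type: "FAIL"; error: string }
  | { type: "RESET" };

function txReducer(state: TxState, action: TxAction): TxState {
  switch (action.type) {
    case "SIGN":
      // Only an idle transaction can start signing
      return state.phase === "idle" ? { phase: "signing" } : state;
    case "SUBMIT":
      return state.phase === "signing"
        ? { phase: "submitted", digest: action.digest }
        : state;
    case "CONFIRM":
      return state.phase === "submitted"
        ? { phase: "confirmed", digest: state.digest }
        : state;
    case "FAIL":
      return { phase: "failed", error: action.error };
    case "RESET":
      return { phase: "idle" };
  }
}
```

A reducer like this plugs directly into `useReducer`, and every component reads the same unambiguous phase instead of juggling separate `pending`/`digest` booleans.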
19.5 Real-Time Data Push
// hooks/useEvents.ts
import { useEffect, useRef, useState } from "react";
import { useCurrentClient } from "@mysten/dapp-kit-react";
export function useRealtimeEvents<T>(
eventType: string,
options?: { maxEvents?: number }
) {
const client = useCurrentClient();
const [events, setEvents] = useState<T[]>([]);
const unsubRef = useRef<(() => void) | null>(null);
const maxEvents = options?.maxEvents ?? 50;
useEffect(() => {
let cancelled = false;
const subscribe = async () => {
const unsubscribe = await client.subscribeEvent({
filter: { MoveEventType: eventType },
onMessage: (event) => {
setEvents((prev) => [event.parsedJson as T, ...prev].slice(0, maxEvents));
},
});
// Guard against the component unmounting while the subscription was pending
if (cancelled) unsubscribe();
else unsubRef.current = unsubscribe;
};
subscribe();
return () => {
cancelled = true;
unsubRef.current?.();
unsubRef.current = null;
};
}, [client, eventType, maxEvents]);
return events;
}
// Usage
function JumpFeed() {
const jumps = useRealtimeEvents<{character_id: string; toll_paid: string}>(
`${TOLL_PACKAGE}::toll_gate_ext::GateJumped`
);
return (
<ul>
{jumps.map((j, i) => (
<li key={i}>
{j.character_id.slice(0, 8)}... paid {Number(j.toll_paid) / 1e9} SUI
</li>
))}
</ul>
);
}
Don’t Use Real-Time Streams to Replace Complete Data Loading
Real-time streams are better suited for:
- Incremental feeds
- Notifications and alerts
- Partial activity information
They should not serve directly as a page’s initial data source. A more stable strategy is usually:
- The page first loads the current snapshot
- Then applies the event stream as incremental updates
- Periodically, or on demand, performs a consistency refresh
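The snapshot-plus-stream strategy boils down to a merge step. A minimal sketch, assuming each feed entry carries a unique `id` and a timestamp (both field names are hypothetical, chosen for illustration):

```typescript
// Hypothetical feed-entry shape: a unique id plus a timestamp for ordering.
interface FeedEntry {
  id: string;
  timestampMs: number;
}

// Merge a loaded snapshot with incrementally received events.
// Incoming events win over snapshot entries with the same id.
function mergeFeed<T extends FeedEntry>(snapshot: T[], incoming: T[], max = 50): T[] {
  const seen = new Set<string>();
  const merged: T[] = [];
  for (const entry of [...incoming, ...snapshot]) {
    if (seen.has(entry.id)) continue; // dedupe by id
    seen.add(entry.id);
    merged.push(entry);
  }
  merged.sort((a, b) => b.timestampMs - a.timestampMs); // newest first
  return merged.slice(0, max);
}
```

Running this on every batch of events keeps the list consistent even when the stream redelivers something the snapshot already contained.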
19.6 Error Handling and User Experience
// components/common/TxButton.tsx
import { useState } from "react";
interface TxButtonProps {
onClick: () => Promise<void>;
children: React.ReactNode;
disabled?: boolean;
}
export function TxButton({ onClick, children, disabled }: TxButtonProps) {
const [status, setStatus] = useState<"idle" | "pending" | "success" | "error">("idle");
const [message, setMessage] = useState("");
const handleClick = async () => {
setStatus("pending");
setMessage("⏳ Submitting...");
try {
await onClick();
setStatus("success");
setMessage("✅ Transaction successful!");
setTimeout(() => { setStatus("idle"); setMessage(""); }, 3000); // also clear the message on reset
} catch (e: any) {
setStatus("error");
// Parse Move abort error code to human-readable message
const abortCode = extractAbortCode(e.message);
setMessage(`❌ ${translateError(abortCode) ?? e.message}`);
}
};
return (
<div>
<button
onClick={handleClick}
disabled={disabled || status === "pending"}
className={`tx-btn tx-btn--${status}`}
>
{status === "pending" ? "⏳ Processing..." : children}
</button>
{message && <p className={`message message--${status}`}>{message}</p>}
</div>
);
}
// Translate Move abort error code to friendly message
function translateError(code: number | null): string | null {
const errors: Record<number, string> = {
0: "Insufficient permissions, please confirm wallet is connected",
1: "Insufficient balance",
2: "Item already sold",
3: "Stargate offline",
};
return code !== null ? errors[code] ?? null : null;
}
function extractAbortCode(message: string): number | null {
const match = message.match(/abort_code: (\d+)/);
return match ? parseInt(match[1]) : null;
}
19.7 CI/CD Automated Deployment
# .github/workflows/deploy.yml
name: Deploy dApp
on:
push:
branches: [main]
pull_request:
branches: [main]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with: { node-version: "20" }
- run: npm ci
- run: npm run test
- run: npm run build
deploy-preview:
if: github.event_name == 'pull_request'
needs: test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with: { node-version: "20" }
- run: npm ci && npm run build
env:
VITE_SUI_RPC_URL: ${{ vars.TESTNET_RPC_URL }}
VITE_WORLD_PACKAGE: ${{ vars.TESTNET_WORLD_PACKAGE }}
- uses: amondnet/vercel-action@v25
with:
vercel-token: ${{ secrets.VERCEL_TOKEN }}
vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
deploy-prod:
if: github.ref == 'refs/heads/main'
needs: test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with: { node-version: "20" }
- run: npm ci && npm run build
env:
VITE_SUI_RPC_URL: ${{ vars.MAINNET_RPC_URL }}
VITE_WORLD_PACKAGE: ${{ vars.MAINNET_WORLD_PACKAGE }}
- uses: amondnet/vercel-action@v25
with:
vercel-token: ${{ secrets.VERCEL_TOKEN }}
vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
vercel-args: "--prod"
🔖 Chapter Summary
| Architecture Component | Tech Choice | Responsibility |
|---|---|---|
| UI Framework | React + Next.js | Page rendering, routing |
| On-Chain Communication | @mysten/dapp-kit + SuiClient | Read chain/sign/send transactions |
| State Management | Zustand (global) + React Query (server) | Cache and sync |
| Real-Time Updates | subscribeEvent (WebSocket) | Event push |
| Error Handling | abort code translation + state machine | User-friendly prompts |
| CI/CD | GitHub Actions + Vercel | Automated testing and deployment |
📚 Further Reading
Chapter 20: In-Game dApp Integration (Overlay UI and Event Communication)
Objective: Master how to embed your dApp into the EVE Frontier game client as a floating panel, enabling seamless interaction between in-game and on-chain data, and initiating signing requests from within the game without switching to an external browser.
Status: Integration chapter. Main content focuses on in-game WebView, overlay UI, and event communication.
20.1 Two dApp Access Modes
EVE Frontier supports two ways to access your dApp:
| Mode | Entry Point | Suitable Scenarios |
|---|---|---|
| External Browser | Player manually opens webpage | Admin panels, data analytics, settings pages |
| In-Game Overlay | Embedded WebView in game client | Transaction popups, real-time status, combat assistance |
In-game integration provides a smoother user experience: players can complete purchases, check inventory, and sign transactions without leaving the game.
The most important point of this chapter is not “WebView can also open webpages,” but rather:
The same dApp actually plays different roles when accessed in-game versus in an external browser.
External browser is more like a complete backend:
- High information density
- Longer operation chains
- Suitable for management, analysis, configuration
In-game overlay is more like an instant tool:
- Must be fast
- Must be brief
- Must be strongly relevant to the current context
If you make both entry points exactly the same, typically both experiences will suffer.
20.2 How In-Game WebView Works
EVE Frontier client has a built-in Chromium WebView that can load external URLs:
Game Client (Unity/Electron)
└── WebView Component
└── Load your dApp URL (https://your-dapp.com)
└── Communicate with EVE Vault (injected in-game)
Key Point: EVE Vault is injected into the game’s WebView window object and shares the same Wallet Standard API as the external browser extension, so the same @mysten/dapp-kit code requires no modification to run in both modes.
But “API compatibility” doesn’t equal “experience equivalence”
Technically you can reuse the same wallet integration code, but that doesn’t mean you can blindly copy-paste the entire product flow.
In-game environments typically face additional constraints:
- Smaller page space
- Shorter player attention span
- Operations may occur while in combat or moving
- Host environment decides open/close timing
So what should actually be reused is the underlying capability, not the entire interaction rhythm.
20.3 Detecting Current Runtime Environment
Your dApp needs to know whether it’s running in-game or in an external browser to make appropriate UI adjustments:
// lib/environment.ts
export type RunEnvironment = "in-game" | "external-browser" | "unknown";
export function detectEnvironment(): RunEnvironment {
// EVE Frontier client injects an identifier in WebView's navigator.userAgent
const ua = navigator.userAgent;
if (ua.includes("EVEFrontier/GameClient")) {
return "in-game";
}
// Can also detect via custom query parameter
const params = new URLSearchParams(window.location.search);
if (params.get("env") === "ingame") {
return "in-game";
}
return "external-browser";
}
export const isInGame = detectEnvironment() === "in-game";
// App.tsx
import { isInGame } from "./lib/environment";
export function App() {
return (
<div className={`app ${isInGame ? "app--ingame" : "app--external"}`}>
{isInGame ? <InGameOverlay /> : <FullDashboard />}
</div>
);
}
What does environment detection really serve?
It’s not just to set an isInGame flag, but to help the page decide:
- Which layout should currently be rendered
- Whether certain buttons should be hidden
- Whether to listen to the game event bridge
- Whether certain complex operations should redirect to external browser
In other words, environment detection is not a presentation layer trick, but part of interaction routing.
20.4 In-Game Overlay UI Design Principles
In-game UI has different design requirements from external web pages:
| External Browser | In-Game Overlay |
|---|---|
| Full-screen layout | Small window (typically 400×600px) |
| Standard font size | Larger fonts, high contrast |
| Hover tooltips | Avoid hover (uncertain if focus is on game or UI) |
| Multi-step forms | Single-step operations, minimize input |
| Rich animations | Lightweight animations (avoid blocking the game view) |
/* ingame.css - In-game overlay exclusive styles */
:root {
--ingame-bg: rgba(10, 15, 25, 0.92);
--ingame-border: rgba(80, 160, 255, 0.4);
--ingame-text: #e0e8ff;
--ingame-accent: #4fa3ff;
}
.app--ingame {
width: 420px;
min-height: 100vh;
background: var(--ingame-bg);
color: var(--ingame-text);
border: 1px solid var(--ingame-border);
backdrop-filter: blur(8px);
font-size: 15px; /* Slightly larger than standard */
font-family: 'Share Tech Mono', monospace; /* EVE style font */
}
/* Ensure buttons are large enough for mouse clicks (in-game precision requirements) */
.ingame-btn {
min-height: 44px;
min-width: 140px;
font-size: 14px;
letter-spacing: 0.05em;
text-transform: uppercase;
}
/* Hide non-essential horizontal navigation */
.app--ingame .sidebar-nav { display: none; }
.app--ingame .header-nav { display: none; }
Most common mistakes with in-game overlays
1. Forcing a backend page into an overlay
The result is:
- Information density too high
- Buttons too small
- User has no idea what the most important action is
2. Making confirmation flows too long
In-game is suitable for:
- Single-step confirmation
- Immediate operations on current object
- Strongly context-relevant actions
Not suitable for:
- Long forms
- Multi-page setup wizards
- Complex filtering backends
3. Visually too “webpage-like,” not enough “embedded tool-like”
Overlays should look more like a control panel for the current facility, not an independent website homepage.
20.5 Game Event Listening (postMessage Bridge)
Game client sends in-game events to WebView via window.postMessage:
// lib/gameEvents.ts
export type GameEvent =
| { type: "PLAYER_ENTERED_RANGE"; assemblyId: string; distance: number }
| { type: "PLAYER_LEFT_RANGE"; assemblyId: string }
| { type: "INVENTORY_CHANGED"; characterId: string }
| { type: "SYSTEM_CHANGED"; fromSystem: string; toSystem: string };
type GameEventHandler = (event: GameEvent) => void;
const handlers = new Set<GameEventHandler>();
// Start listener (call once at app startup)
export function startGameEventListener() {
window.addEventListener("message", (e) => {
// Only handle messages from game client (verify via origin or agreed source field)
if (e.data?.source !== "EVEFrontierClient") return;
const event = e.data as { source: string } & GameEvent;
if (!event.type) return;
for (const handler of handlers) {
handler(event);
}
});
}
export function onGameEvent(handler: GameEventHandler) {
handlers.add(handler);
// Return an unsubscribe function
return () => { handlers.delete(handler); };
}
The most important part of event bridge is not “can receive messages,” but stable message semantics
A mature message bridge protocol should at least ensure:
- Stable event types
- Stable field names and meanings
- Frontend can safely degrade when fields are missing
- Both frontend and backend know which events are one-time triggers vs. state syncs
Otherwise, when the game client changes a field, the frontend will fail silently in the most difficult environment to debug.
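That “safely degrade” requirement can be made concrete with a defensive parser. The sketch below mirrors a subset of the `GameEvent` union from above and drops anything malformed instead of crashing the overlay; the specific field checks are assumptions about the bridge protocol:

```typescript
// Subset of the GameEvent union above, redefined here so the sketch is self-contained.
type GameEvent =
  | { type: "PLAYER_ENTERED_RANGE"; assemblyId: string; distance: number }
  | { type: "PLAYER_LEFT_RANGE"; assemblyId: string };

// Defensive parsing for bridged messages: unknown or malformed payloads
// are rejected (null) instead of throwing inside the message handler.
function parseGameEvent(data: unknown): GameEvent | null {
  if (typeof data !== "object" || data === null) return null;
  const msg = data as Record<string, unknown>;
  if (msg.source !== "EVEFrontierClient") return null; // not from the game client
  switch (msg.type) {
    case "PLAYER_ENTERED_RANGE":
      return typeof msg.assemblyId === "string" && typeof msg.distance === "number"
        ? { type: "PLAYER_ENTERED_RANGE", assemblyId: msg.assemblyId, distance: msg.distance }
        : null;
    case "PLAYER_LEFT_RANGE":
      return typeof msg.assemblyId === "string"
        ? { type: "PLAYER_LEFT_RANGE", assemblyId: msg.assemblyId }
        : null;
    default:
      return null; // unknown event types degrade silently
  }
}
```

Wiring this in front of the dispatcher means a renamed or missing field from a client update surfaces as “event ignored,” which is far easier to diagnose than a crash inside a WebView.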
Using Game Events in React
// hooks/useGameEvents.ts
import { useEffect } from "react";
import { onGameEvent, GameEvent } from "../lib/gameEvents";
export function useGameEvent<T extends GameEvent["type"]>(
type: T,
handler: (event: Extract<GameEvent, { type: T }>) => void,
) {
useEffect(() => {
return onGameEvent((event) => {
if (event.type === type) {
handler(event as Extract<GameEvent, { type: T }>);
}
});
// Note: pass a stable handler (e.g., wrapped in useCallback), otherwise this
// effect unsubscribes and resubscribes on every render
}, [type, handler]);
}
// Use case: Auto-open ticket panel when player enters stargate range
function GatePanel() {
const [nearGate, setNearGate] = useState<string | null>(null);
useGameEvent("PLAYER_ENTERED_RANGE", (event) => {
setNearGate(event.assemblyId);
});
useGameEvent("PLAYER_LEFT_RANGE", () => {
setNearGate(null);
});
if (!nearGate) return null;
return <JumpTicketPanel gateId={nearGate} />;
}
Don’t treat game events as on-chain truth
Event bridges are best suited for:
- Current context prompts
- UI popup/close
- Current object context switching
But actions truly involving assets and permissions should still rely on on-chain objects and formal verification processes.
In other words:
- Game events tell you “the player probably wants to operate on this object now”
- On-chain data tells you “what state this object is actually in right now”
20.6 Initiating Signing Requests from In-Game
Since EVE Vault is injected in-game, signing requests directly trigger the game’s built-in Vault UI:
// components/InGameMarket.tsx
import { useState } from "react";
import { useDAppKit } from "@mysten/dapp-kit-react";
import { Transaction } from "@mysten/sui/transactions";
export function InGameMarket({ gateId }: { gateId: string }) {
const dAppKit = useDAppKit();
const [status, setStatus] = useState("");
const handleBuy = async () => {
setStatus("Please confirm transaction in top-right wallet...");
const tx = new Transaction();
tx.moveCall({
target: `${TOLL_PKG}::toll_gate_ext::pay_toll_and_get_permit`,
arguments: [/* ... */],
});
try {
// Signing request triggers game's built-in EVE Vault popup
await dAppKit.signAndExecuteTransaction({
transaction: tx,
});
setStatus("✅ Permit issued!");
} catch (e: any) {
if (e.message?.includes("User rejected")) {
setStatus("❌ Cancelled");
} else {
setStatus(`❌ ${e.message}`);
}
}
};
return (
<div className="ingame-market">
<div className="gate-info">
<span>⛽ Toll: 10 SUI</span>
<span>⏱ Validity: 30 minutes</span>
</div>
<button className="ingame-btn" onClick={handleBuy}>
🚀 Purchase Permit
</button>
{status && <p className="status">{status}</p>}
</div>
);
}
The key to in-game signing experience is not “can sign,” but “don’t interrupt user flow”
The best in-game signing flows typically have these characteristics:
- Clearly communicate key costs before signing
- Can quickly return to original context after failure
- Immediately show current object state change after success
If users feel like signing is suddenly switching out to do an external wallet task, the value of in-game integration drops significantly.
20.7 Responsive Switching: Same Codebase Adapts to Both Scenarios
// App.tsx complete example
import { isInGame } from "./lib/environment";
import { startGameEventListener } from "./lib/gameEvents";
import { useEffect } from "react";
export function App() {
useEffect(() => {
if (isInGame) startGameEventListener();
}, []);
return (
<EveFrontierProvider>
{isInGame ? (
// In-game: Streamlined single-function overlay
<InGameOverlay />
) : (
// External browser: Full-featured dashboard
<FullDashboard />
)}
</EveFrontierProvider>
);
}
20.8 In-Game dApp URL Configuration
Provide players with the correct URL to add custom dApps in game settings:
Your dApp address (opens in game WebView):
https://your-dapp.com?env=ingame
# Or add via game client's "Custom Panel" feature
# Game will automatically attach EVEFrontier/GameClient identifier in User-Agent
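Building that URL by hand invites typos; a tiny helper (hypothetical, using the standard `URL` API) keeps the `env=ingame` parameter consistent with the detection logic in 20.3:

```typescript
// Build the in-game variant of a dApp URL by setting the env=ingame
// query parameter that detectEnvironment() looks for.
function toInGameUrl(baseUrl: string): string {
  const url = new URL(baseUrl);
  url.searchParams.set("env", "ingame"); // overwrites any existing env value
  return url.toString();
}
```

This is the string you would publish for players to paste into the game client’s custom panel settings.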
🔖 Chapter Summary
| Knowledge Point | Core Takeaway |
|---|---|
| Two access modes | External browser (complete) vs in-game WebView (streamlined) |
| Environment detection | navigator.userAgent or query parameter detection |
| UI adaptation | Small window, large fonts, single-step operations, high contrast |
| Game event listening | window.postMessage + event dispatcher |
| Seamless signing integration | EVE Vault injected in-game, identical API |
| Responsive switching | Same codebase, isInGame conditional rendering |
📚 Further Reading
- dapp-kit documentation
- EVE Vault Introduction
- Chapter 5: dApp Frontend Development
- Chapter 19: Full-Stack dApp Architecture
Chapter 21: Performance Optimization and Gas Minimization
Objective: Master performance optimization techniques for on-chain operations, maximize off-chain computation, and build efficient, low-cost EVE Frontier applications through batching, object design optimization, and Gas budget control.
Status: Engineering chapter. Main content focuses on Gas, batching, and object design optimization.
21.1 Gas Cost Model
Sui’s Gas consists of two components:
Gas Fee = (Computation Units + Storage Delta) × Gas Price
- Computation Units: Move code execution consumption
- Storage Delta: Net increase in on-chain storage (new bytes charged, deleted bytes refunded)
Key Insights:
- Reading data is free (GraphQL/RPC reads don’t go on-chain)
- Adding/removing dynamic fields has significant Gas cost
- Emitting events is almost free (doesn’t occupy on-chain storage)
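The model above can be checked against real transaction effects. A minimal sketch, assuming the string-valued cost fields commonly found in RPC `effects.gasUsed` responses (the field names here are an assumption to verify against your RPC version):

```typescript
// Assumed shape of effects.gasUsed in RPC responses (values arrive as strings).
interface GasUsed {
  computationCost: string;
  storageCost: string;
  storageRebate: string;
}

// Net fee actually paid: computation plus storage, minus the rebate
// returned for bytes deleted in the same transaction.
function netGasFee(gas: GasUsed): bigint {
  return BigInt(gas.computationCost) + BigInt(gas.storageCost) - BigInt(gas.storageRebate);
}
```

Logging this value before and after a refactor is the simplest way to see whether an “optimization” actually changed what you pay.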
The easiest way to get Gas optimization wrong is to jump straight to “how do I shave off a few units” without first seeing clearly that:
What’s truly expensive is often not a particular line of code, but the work your overall state model forces the system to do repeatedly.
So performance optimization is best viewed in three layers:
- Transaction Layer Can this transaction be merged, is it doing many small actions repeatedly
- Object Layer Are your objects too large, too hot, too centralized
- Architecture Layer Which computations and aggregations shouldn’t be on-chain at all
21.1.1 A Reusable Gas Comparison Record Template
A chapter like this easily degenerates into slogans. It’s recommended to record “before/after” data for at least one fixed set of operations:
| Operation | Inefficient Approach | Optimized Approach | Fields You Should Record |
|---|---|---|---|
| Two stargates online + link | 3 separate transactions | 1 PTB batch | gasUsed, object writes count, total time |
| Market create listing | Append to large object vector | Independent object or dynamic field | Object size, write count, storage rebate |
| History records | Persist to shared object | Emit event + off-chain indexing | Event count, object growth bytes |
These numbers don’t need to be absolute benchmarks, but you must keep the comparison records under the same environment; otherwise your optimization conclusions carry no weight.
21.2 Batching: Do Multiple Things in One Transaction
Sui’s Programmable Transaction Blocks (PTB) allow executing multiple Move calls in one transaction:
// ❌ Inefficient: 3 separate transactions
await client.signAndExecuteTransaction({ transaction: tx_online }); // Online gate 1
await client.signAndExecuteTransaction({ transaction: tx_online }); // Online gate 2
await client.signAndExecuteTransaction({ transaction: tx_link }); // Link gates
// ✅ Efficient: 1 transaction completes all operations
const tx = new Transaction();
// Borrow OwnerCap (once)
const [ownerCap1, receipt1] = tx.moveCall({ target: `${PKG}::character::borrow_owner_cap`, ... });
const [ownerCap2, receipt2] = tx.moveCall({ target: `${PKG}::character::borrow_owner_cap`, ... });
// Execute all operations
tx.moveCall({ target: `${PKG}::gate::online`, arguments: [gate1, ownerCap1, ...] });
tx.moveCall({ target: `${PKG}::gate::online`, arguments: [gate2, ownerCap2, ...] });
tx.moveCall({ target: `${PKG}::gate::link`, arguments: [gate1, gate2, ...] });
// Return OwnerCap
tx.moveCall({ target: `${PKG}::character::return_owner_cap`, arguments: [..., receipt1] });
tx.moveCall({ target: `${PKG}::character::return_owner_cap`, arguments: [..., receipt2] });
await client.signAndExecuteTransaction({ transaction: tx });
// Save 2/3 of base Gas fee!
21.2.1 How to Record a Real Gas Comparison
- First fix the inputs: same network, same object count, same batch of operations
- Execute the inefficient version and record its results: digest, gasUsed, and the write-object count in effects
- Then execute the PTB version and record the same fields
- Organize the results into a comparison table and write it into your release or optimization notes
Recommended to record at least these fields:
- digest
- computationCost
- storageCost
- storageRebate
- nonRefundableStorageFee
- changedObjects count
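Given two such records, the savings figure for your comparison table is one subtraction away. A sketch using the fields listed above (the `GasRecord` shape itself is an assumption about what you choose to log):

```typescript
// One logged run, using the fields recommended above.
interface GasRecord {
  digest: string;
  computationCost: number;
  storageCost: number;
  storageRebate: number;
}

function netCost(r: GasRecord): number {
  return r.computationCost + r.storageCost - r.storageRebate;
}

// Percentage saved by the optimized run relative to the baseline run.
function savingsPercent(before: GasRecord, after: GasRecord): number {
  const baseline = netCost(before);
  return Math.round(((baseline - netCost(after)) / baseline) * 100);
}
```

Keeping the digests in the record means anyone can re-verify your comparison against the chain later.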
PTB is not “merge everything you can”
Batching is powerful, but that doesn’t mean you should blindly stuff every action into one transaction.
Suitable for merging:
- Steps that are already strongly related
- Flows that must atomically succeed or fail together
- Operations borrowing the same type of permission object multiple times
Signs that you are over-merging:
- Stuffing too much unrelated logic into one transaction
- Hard to pinpoint problems once failure occurs
- Gas budget and computation become unpredictable
So PTB’s goal is not “maximize length,” but “converge a flow that should truly be atomic.”
21.3 Object Design Optimization
Principle One: Avoid Large Objects
// ❌ Put all data in one object (max 250KB)
public struct BadMarket has key {
id: UID,
listings: vector<Listing>, // Object grows as products increase
bid_history: vector<BidRecord>, // History data grows infinitely
}
// ✅ Use dynamic fields or independent objects for distributed storage
public struct GoodMarket has key {
id: UID,
listing_count: u64, // Only store counter
// Specific Listing stored via dynamic fields: df::add(id, item_id, listing)
}
Principle Two: Delete Objects No Longer Needed (Get Storage Rebate)
// After auction ends, delete Listing to get Gas refund
public fun end_auction(auction: DutchAuction) {
let DutchAuction { id, .. } = auction;
id.delete(); // Delete object → storage rebate
}
// After claiming, delete DividendClaim object
public fun close_claim_record(record: DividendClaim) {
let DividendClaim { id, .. } = record;
id.delete();
}
Principle Three: Use u8/u16 Instead of u64 for Small Integers
// ❌ Waste space
public struct Config has key {
id: UID,
tier: u64, // Only stores 1-5, but occupies 8 bytes
status: u64, // Only stores 0-3, but occupies 8 bytes
}
// ✅ Compact storage
public struct Config has key {
id: UID,
tier: u8, // Only occupies 1 byte
status: u8, // Only occupies 1 byte
}
Why is object design almost always the root cause of performance issues?
Because on Sui, performance and object model are tied together:
- Larger objects mean heavier reads/writes
- Hotter shared objects mean higher contention
- More centralized state means harder to scale
So many performance optimizations end up not “rewriting algorithms,” but “refactoring object boundaries.”
A very practical criterion
As long as an object has both of these characteristics, you should start to be alert:
- Frequently written
- Still growing
Such objects will almost certainly become performance hotspots.
21.4 Off-Chain Computation, On-Chain Verification
Golden Rule: All computation that doesn’t need enforcement should be done off-chain.
// ❌ Sort on-chain (extremely Gas-consuming)
public fun get_top_bidders(auction: &Auction, n: u64): vector<address> {
let mut sorted = vector::empty<BidRecord>();
// ... O(n²) sorting, executed on-chain every time
}
// ✅ Store raw data on-chain, sort off-chain
public fun get_bid_at(auction: &Auction, index: u64): BidRecord {
*df::borrow<u64, BidRecord>(&auction.id, index)
}
// dApp or backend reads all bids, sorts in memory, displays leaderboard
Complex Routing Computation Done Off-Chain
// Example: Stargate logistics routing (off-chain optimal path calculation)
function findOptimalRoute(
start: string,
end: string,
gateGraph: Map<string, string[]>, // gate_id → [connected_gate_ids]
): string[] {
// Dijkstra or other path algorithms, executed in dApp/backend
// After calculating optimal path, only submit final jump operations on-chain
return dijkstra(gateGraph, start, end);
}
Off-chain computation is not cutting corners, but proper division of labor
Work that belongs off-chain is usually not “unimportant”; rather:
- The results need to be displayed, but don’t need on-chain enforcement
- The algorithm is complex, but only a conclusion needs to be submitted
- The results can be recalculated, cached, or replaced
Forcing such work on-chain only raises the cost and the failure surface together.
When must you verify on-chain?
When results affect:
- Asset ownership
- Permission authorization
- Amount settlement
- Scarce resource allocation
Then you must put key conclusions back on-chain for verification, not just trust off-chain calculations.
21.5 Gas Budget Setting
const tx = new Transaction();
// Set a Gas budget cap in MIST (prevents unexpected over-consumption)
tx.setGasBudget(10_000_000); // 10,000,000 MIST = 0.01 SUI
// Or use dryRun to estimate Gas
const estimate = await client.dryRunTransactionBlock({
transactionBlock: await tx.build({ client }),
});
console.log("Estimated Gas:", estimate.effects.gasUsed);
The most valuable part of dryRun is not “estimating a number,” but discovering model issues early
If a transaction’s dry run results already show:
- Many write objects
- Abnormally high storage cost
- Very little refund
That usually means the problem is not in the budget, but in the structure itself.
21.6 Parallel Execution: Contention-Free Shared Object Design
Sui can execute transactions operating on different objects in parallel. Contention for the same shared object causes sequential execution:
// ❌ All users contend for the same Market object
Market (shared) ← All purchase transactions need write lock → Sequential execution
(High traffic causes queue buildup, latency increases)
// ✅ Sharding design (multiple SubMarkets)
Market_Shard_0 (shared) ← Transactions where item type_id % 4 == 0
Market_Shard_1 (shared) ← Transactions where item type_id % 4 == 1
Market_Shard_2 (shared) ← Transactions where item type_id % 4 == 2
Market_Shard_3 (shared) ← Transactions where item type_id % 4 == 3
(4 shards execute in parallel, throughput ×4)
// Shard routing
public fun buy_item_sharded(
shards: &mut vector<MarketShard>,
item_type_id: u64,
payment: Coin<SUI>,
ctx: &mut TxContext,
) {
let shard_index = item_type_id % vector::length(shards);
let shard = vector::borrow_mut(shards, shard_index);
buy_from_shard(shard, item_type_id, payment, ctx);
}
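The dApp side must apply the same routing rule before it can pass the right shared object into the transaction. A trivial sketch (that the ordering of `shardIds` matches the on-chain shard indices is an assumption your deployment script has to guarantee):

```typescript
// Client-side counterpart of the on-chain shard routing: pick which
// shared MarketShard object id a given item's transaction should touch.
function pickShardId(shardIds: string[], itemTypeId: number): string {
  if (shardIds.length === 0) throw new Error("no shards configured");
  return shardIds[itemTypeId % shardIds.length];
}
```

Because the rule is pure modulo arithmetic, the frontend, indexer, and contract can all derive it independently without an extra lookup.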
The most important question in concurrency design
Not “can it be parallel,” but:
In my business flow, which states must contend for the same shared object, and which can naturally be split?
For example, in a market system, common dimensions that can be split include:
- Item type
- Region
- Tenant
- Time bucket
As long as the split dimension is chosen correctly, throughput typically improves significantly.
Sharding also has costs
Don’t treat sharding as a free lunch. It brings:
- More complex query aggregation
- More complex routing logic
- Frontend and indexing layers need to know sharding rules
So sharding is a clear trade-off of “increasing system complexity for throughput,” not the default option.
🔖 Chapter Summary
| Optimization Technique | Savings Ratio |
|---|---|
| PTB batching (merge multiple transactions) | 30-70% base fee |
| Off-chain computation, on-chain verification | Eliminate complex computation Gas |
| Delete obsolete objects | Get storage rebate |
| Compact data types (u8 vs u64) | Reduce object size |
| Shard shared objects | Increase concurrent throughput |
📚 Further Reading
Chapter 22: Advanced Move Patterns — Upgrade Compatibility Design
Objective: Master production-grade Move contract upgrade compatibility architecture, including versioned APIs, data migration, Policy control, and smooth upgrades without service interruption.
Status: Advanced design chapter. Main content focuses on upgrade compatibility, migration, and timelock control.
22.1 The Essence of Upgrade Compatibility Issues
Move contract upgrades face two core constraints:
Constraint 1: Struct definitions cannot be modified (cannot add/remove fields, cannot change field types)
Constraint 2: Function signatures cannot be modified (parameters and return values cannot change)
BUT:
✅ Can add new functions
✅ Can add new modules
✅ Can modify function internal logic (without changing signature)
✅ Can add new structs
Challenge: If your contract v1 has a Market struct and v2 wants to add an expiry_ms field, you cannot modify it directly.
What this upgrade compatibility chapter really needs to solve is not “how to release a new version,” but:
How to keep a system that’s already depended upon by objects, frontends, scripts, and users alive.
So the upgrade problem is essentially a four-layer compatibility problem:
- On-chain object compatibility
- On-chain interface compatibility
- Frontend parsing compatibility
- Operations process compatibility
22.2 Extension Pattern: Use Dynamic Fields to Add “Future Fields”
Best Practice: Reserve extension space for future fields in advance:
module my_market::market_v1;
use sui::dynamic_field as df;
/// Current fields
public struct Market has key {
id: UID,
toll: u64,
owner: address,
// Note: Don't try to predict future needed fields — because you can't change them
// Instead, rely on dynamic fields for extension
}
// V1 → V2: Add expiry_ms via dynamic field
// (Called in migration script after package upgrade)
public fun add_expiry_field(
market: &mut Market,
expiry_ms: u64,
) {
// Only add if this field doesn't exist yet
if (!df::exists_(&market.id, b"expiry_ms")) {
df::add(&mut market.id, b"expiry_ms", expiry_ms);
}
}
/// V2 version reads expiry (backward compatible: returns default when old objects lack this field)
public fun get_expiry(market: &Market): u64 {
if (df::exists_(&market.id, b"expiry_ms")) {
*df::borrow<vector<u8>, u64>(&market.id, b"expiry_ms")
} else {
0 // Default: never expires
}
}
Why do dynamic fields become an upgrade escape hatch?
Because they let you supplement old objects with new semantics without changing the original struct layout.
But they also have boundaries:
- Suitable for appending fields
- Not suitable for cramming all future complex structures in
If a version upgrade requires patching many temporary fields onto objects, that usually means you should rethink the model, not rely infinitely on patch-style extensions.
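On the frontend, the same default-on-missing convention has to be mirrored, since v1 objects simply lack the dynamic field. A hypothetical sketch (the field map shape is an illustration for clarity, not the real RPC response):

```typescript
// Hypothetical sketch: mirror the on-chain default on the frontend.
// A v1 Market object has no "expiry_ms" dynamic field, so the dApp
// must tolerate its absence instead of assuming the v2 layout.
type MarketFields = {
  toll: string;
  owner: string;
  // Dynamic fields arrive separately (e.g. via getDynamicFields);
  // here they are modeled as an optional map for illustration.
  dynamicFields?: Record<string, string>;
};

function getExpiryMs(market: MarketFields): number {
  const raw = market.dynamicFields?.["expiry_ms"];
  // Same convention as the Move getter: missing field means "never expires" (0)
  return raw !== undefined ? Number(raw) : 0;
}
```

Keeping the fallback value identical on both sides (0 = never expires) means old and new objects render consistently without the frontend knowing which version it is looking at.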
22.3 Versioned API Design
When you need to change function behavior, keep the old version and add a new version:
module my_market::market;
/// V1 API (always maintain backward compatibility)
public fun buy_item_v1(
market: &mut Market,
payment: Coin<SUI>,
item_type_id: u64,
ctx: &mut TxContext,
): Item {
// Original logic
}
/// V2 API (new feature: supports discount codes)
public fun buy_item_v2(
market: &mut Market,
payment: Coin<SUI>,
item_type_id: u64,
discount_code: Option<vector<u8>>, // New parameter
clock: &Clock, // New parameter (time validation)
ctx: &mut TxContext,
): Item {
// New logic (includes discount processing)
let effective_price = apply_discount(market, item_type_id, discount_code, clock);
// ...
}
dApp Adaptation: Check contract version on TypeScript side, choose which function to call:
async function buyItem(useV2: boolean /* , ...remaining args elided */) {
const tx = new Transaction();
if (useV2) {
tx.moveCall({ target: `${PKG}::market::buy_item_v2`, ... });
} else {
tx.moveCall({ target: `${PKG}::market::buy_item_v1`, ... });
}
}
Why is keeping old entry points often more stable than forcing a complete migration?
Because in a production system, you are never the only caller:
- The old frontend is still running
- User scripts may still depend on the old interface
- Third-party aggregators may not have upgraded yet
So the most stable upgrade path is usually not a one-cut replacement, but:
- Let old and new versions coexist
- Give callers a migration window
- Gradually retire old interfaces
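The dispatch logic above can also key off an on-chain version number instead of a hardcoded flag. A minimal sketch, assuming the package exposes the chapter's `buy_item_v1`/`buy_item_v2` entry points and that the frontend has already read a numeric contract version:

```typescript
// Sketch: pick the Move call target from an on-chain version number.
// The package and function names follow the chapter's example and are
// assumptions, not a published interface.
function buyItemTarget(pkg: string, contractVersion: number): string {
  // Old frontends keep working because buy_item_v1 is never removed;
  // callers opt into v2 once the contract reports version >= 2.
  return contractVersion >= 2
    ? `${pkg}::market::buy_item_v2`
    : `${pkg}::market::buy_item_v1`;
}
```

Centralizing the choice in one function means the migration window can later be closed by changing a single branch rather than hunting for scattered `moveCall` targets.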
22.4 Upgrade Locking Strategy
For high-value contracts, you can add timelocks on UpgradeCap:
module my_gov::upgrade_timelock;
use sui::package::UpgradeCap;
use sui::clock::Clock;
// Assumption: AdminCap is defined elsewhere in this package
use my_gov::admin::AdminCap;
const EAlreadyAnnounced: u64 = 0;
const ENotAnnounced: u64 = 1;
const ETimelockNotExpired: u64 = 2;
public struct TimelockWrapper has key {
id: UID,
upgrade_cap: UpgradeCap,
delay_ms: u64, // Wait time required before upgrade announcement
announced_at_ms: u64, // Announcement time (0 = not announced)
}
/// Step 1: Announce upgrade intent (start timer)
public fun announce_upgrade(
wrapper: &mut TimelockWrapper,
_admin: &AdminCap,
clock: &Clock,
) {
assert!(wrapper.announced_at_ms == 0, EAlreadyAnnounced);
wrapper.announced_at_ms = clock.timestamp_ms();
}
/// Step 2: Can only execute upgrade after delay period
public fun authorize_upgrade(
wrapper: &mut TimelockWrapper,
clock: &Clock,
): &mut UpgradeCap {
assert!(wrapper.announced_at_ms > 0, ENotAnnounced);
assert!(
clock.timestamp_ms() >= wrapper.announced_at_ms + wrapper.delay_ms,
ETimelockNotExpired,
);
// Reset, next upgrade requires re-announcement
wrapper.announced_at_ms = 0;
&mut wrapper.upgrade_cap
}
What a timelock really protects is not the code, but trust
It gives the community, collaborators, and users an observation window, so upgrades never become “the admin can change whatever they want tonight.”
This matters most in high-value protocols, because upgrade risk is often not a technical bug but a governance risk.
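The timelock arithmetic in `authorize_upgrade` can be mirrored off-chain, for example in a dashboard that shows whether an announced upgrade is executable yet. A minimal sketch, using the same millisecond convention as `clock.timestamp_ms()`:

```typescript
// Sketch of the timelock state machine from the Move example above.
// All timestamps are milliseconds; announcedAtMs === 0 means
// "no upgrade announced", matching the contract's convention.
function upgradeStatus(
  announcedAtMs: number,
  delayMs: number,
  nowMs: number,
): "not_announced" | "waiting" | "executable" {
  if (announcedAtMs === 0) return "not_announced";
  // Mirrors: clock.timestamp_ms() >= announced_at_ms + delay_ms
  return nowMs >= announcedAtMs + delayMs ? "executable" : "waiting";
}
```

Because the check is pure arithmetic over public state, anyone (not just the admin) can compute when the observation window closes.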
22.5 Large-Scale Data Migration Strategy
When needing to rebuild storage structure, adopt “incremental migration” rather than “one-time migration”:
// Scenario: Migrate ListingsV1 (vector) to ListingsV2 (Table)
module migration::market_migration;
use std::u64::min;
public struct MigrationState has key {
id: UID,
migrated_count: u64,
total_count: u64,
is_complete: bool,
}
/// Migrate one batch at a time (avoid exceeding computation limit in one transaction)
public fun migrate_batch(
old_market: &mut MarketV1,
new_market: &mut MarketV2,
state: &mut MigrationState,
batch_size: u64, // Process batch_size records each time
ctx: &TxContext,
) {
let start = state.migrated_count;
let end = min(start + batch_size, state.total_count);
let mut i = start;
while (i < end) {
let listing = get_listing_v1(old_market, i);
insert_listing_v2(new_market, listing);
i = i + 1;
};
state.migrated_count = end;
if (end == state.total_count) {
state.is_complete = true;
};
}
Migration Script: Auto-loop execution until complete
async function runMigration(stateId: string) {
let isComplete = false;
let batchNum = 0;
while (!isComplete) {
const tx = new Transaction();
tx.moveCall({
target: `${MIGRATION_PKG}::market_migration::migrate_batch`,
arguments: [/* ... */, tx.pure.u64(100)], // 100 per batch
});
const result = await client.signAndExecuteTransaction({ signer: adminKeypair, transaction: tx });
console.log(`Batch ${++batchNum} done:`, result.digest);
// Check migration state
const state = await client.getObject({ id: stateId, options: { showContent: true } });
isComplete = (state.data?.content as any)?.fields?.is_complete;
await new Promise(r => setTimeout(r, 1000)); // 1 second interval
}
console.log("Migration complete!");
}
Why is incremental migration better than all-at-once?
Because a real production system has to balance several constraints at once:
- Per-transaction computation limits
- Risk control
- Failure recovery
- Keeping the service running during migration
The biggest problem with a one-shot migration is not that it can't be written, but that:
- It is hard to recover from a mid-migration failure
- State is easily left half-old, half-new after a failure
- An oversized transaction simply cannot be sent
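The batching above can also be planned off-chain before any transaction is sent. A small sketch that precomputes the `[start, end)` ranges `migrate_batch` would walk, so the script knows up front how many transactions to expect:

```typescript
// Sketch: precompute batch ranges for an incremental migration.
// Mirrors the Move loop: each batch covers [start, min(start + batchSize, total)).
function planBatches(total: number, batchSize: number): Array<[number, number]> {
  const ranges: Array<[number, number]> = [];
  for (let start = 0; start < total; start += batchSize) {
    ranges.push([start, Math.min(start + batchSize, total)]);
  }
  return ranges;
}
```

Knowing the batch count in advance also lets the script show progress ("batch 3 of 12") and estimate total gas before committing to the migration.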
22.6 Complete Upgrade Workflow
① Develop new version contract (local + testnet validation)
② Announce upgrade intent (TimeLock starts timer, notify community)
③ Community review period (72 hours)
④ After TimeLock expires, execute sui client upgrade --upgrade-capability <CAP_ID>
⑤ Run data migration scripts (if necessary)
⑥ Update dApp configuration (new Package ID, new interface version)
⑦ Announce upgrade complete
A mature team treats an upgrade as a controlled release event
Besides the on-chain actions themselves, prepare in parallel:
- An upgrade announcement
- A frontend switch-over plan
- A rollback or downtime contingency
- Post-upgrade observation metrics
Otherwise, “on-chain upgrade complete” does not mean the system has stably finished upgrading.
🔖 Chapter Summary
| Knowledge Point | Core Takeaway |
|---|---|
| Upgrade constraints | Struct/function signatures unchangeable, but can add new functions/modules |
| Dynamic field extension | df::add() adds “future fields” at runtime |
| Versioned API | buy_v1() / buy_v2() coexist, dApp chooses by version |
| TimeLock upgrade | Announcement + waiting period → community review → can execute |
| Incremental migration | migrate_batch() processes in batches, avoid exceeding computation limit |
📚 Further Reading
Chapter 23: Deployment, Maintenance, and Community Collaboration
Objective: Master the complete deployment process from development to production, understand the boundaries and positioning of the Builder ecosystem, and become a sustainably active EVE Frontier builder.
Status: Deployment and operations chapter. Main content focuses on launch process, maintenance, and Builder collaboration.
23.1 Complete Deployment Checklist
From local development to official launch, you need to go through the following phases:
Phase 1 — Local Development (Localnet)
✅ Docker local chain running
✅ Move build compiles successfully
✅ All unit tests pass
✅ Functional testing (scripts simulate complete flow)
Phase 2 — Testnet
✅ sui client publish to testnet
✅ Extension registered to test components
✅ dApp deployed to test URL
✅ Invite small group of users for testing
Phase 3 — Mainnet Launch
✅ Code audit (self-audit + community review)
✅ Backup UpgradeCap to secure address
✅ sui client switch --env mainnet
✅ Publish contract, record Package ID
✅ dApp deployed to official domain
✅ Notify community / update announcements
This checklist itself is fine, but the concept behind it matters more:
Deployment is not “moving code from local to chain,” but “switching a real service that people will use and depend on into its production state.”
So deployment must cover four tracks simultaneously:
- Contract track: Is the package published correctly? Are permissions configured correctly?
- Frontend track: Is the dApp connected to the correct network and objects?
- Operations track: Do users know how to use the new version? Have the old entry points been invalidated?
- Emergency track: Who responds when something breaks, and which layer is stopped first?
23.2 Network Environment Configuration
Sui and EVE Frontier support three networks:
| Network | Purpose | RPC Address |
|---|---|---|
| localnet | Local development, Docker startup | http://127.0.0.1:9000 |
| testnet | Public testing, no real value | https://fullnode.testnet.sui.io:443 |
| mainnet | Official production environment | https://fullnode.mainnet.sui.io:443 |
# Switch to different networks
sui client switch --env testnet
sui client switch --env mainnet
# View current network
sui client envs
sui client active-env
# View account balance
sui client balance
Environment Switching in dApp
// Control which network dApp connects to via environment variables
const RPC_URL = import.meta.env.VITE_SUI_RPC_URL
?? 'https://fullnode.testnet.sui.io:443'
const WORLD_PACKAGE = import.meta.env.VITE_WORLD_PACKAGE
?? '0x...' // testnet package id
const client = new SuiClient({ url: RPC_URL })
The most common mistake in environment switching is not the commands themselves, but “half-switching”:
- The CLI is already on mainnet
- The frontend still reads testnet
- The wallet is connected to yet another environment
- Documentation and announcements still list the old Package ID
Once this half-switched state appears, the symptoms are usually confusing:
- The contract looks successfully published, but the frontend doesn't recognize it
- The user's wallet connects, but objects can't be found
- It works locally for you, but not in anyone else's environment
So what really needs verifying is not “did something somewhere change,” but “do all entry points point at the same environment.”
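One way to catch half-switching early is a consistency check over every entry point's network. A hypothetical sketch; the config shape and URL heuristics are illustrations, not a real SDK structure:

```typescript
// Hypothetical sketch: detect a "half-switched" configuration by
// checking that every entry point names the same network.
type EnvName = "localnet" | "testnet" | "mainnet";

function networkOf(rpcUrl: string): EnvName | "unknown" {
  // Heuristic classification of the RPC URL (an assumption, not an API)
  if (rpcUrl.includes("127.0.0.1") || rpcUrl.includes("localhost")) return "localnet";
  if (rpcUrl.includes("testnet")) return "testnet";
  if (rpcUrl.includes("mainnet")) return "mainnet";
  return "unknown";
}

function detectHalfSwitch(cfg: {
  cliEnv: EnvName;   // from `sui client active-env`
  rpcUrl: string;    // from VITE_SUI_RPC_URL
  walletEnv: EnvName; // network the wallet reports
}): string[] {
  const problems: string[] = [];
  const rpcEnv = networkOf(cfg.rpcUrl);
  if (rpcEnv !== cfg.cliEnv) problems.push(`RPC points at ${rpcEnv}, CLI at ${cfg.cliEnv}`);
  if (cfg.walletEnv !== cfg.cliEnv) problems.push(`wallet on ${cfg.walletEnv}, CLI on ${cfg.cliEnv}`);
  return problems; // empty array = all entry points agree
}
```

Running a check like this in CI or at dApp startup turns a confusing class of bugs into a single explicit error message.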
23.3 Testnet to Mainnet Considerations
- Package ID will change: New Package ID after Mainnet publish, dApp config needs updating
- Data not universal: Objects created on Testnet (characters, components) don’t exist on Mainnet, need re-initialization
- Gas fees are real: SUI on Mainnet has real value, publishing and operations consume real Gas
- Irreversible: Shared objects (share_object) cannot be withdrawn
From testnet to mainnet, the biggest trap is “mental copying”
Many teams subconsciously assume:
- The flow worked on testnet
- So mainnet is just “repeating it once”
It isn't. The biggest difference between mainnet and testnet is not just asset value:
- User expectations are higher
- Mistakes cost more
- There is less room to roll back
- Community trust is more fragile
So before a mainnet launch, treat yourself as launching a product, not submitting homework.
23.4 Package Upgrade Best Practices
Securely Store UpgradeCap
UpgradeCap is the most sensitive permission object; losing it means you cannot upgrade the contract:
# View your UpgradeCap
sui client objects --json | grep -A5 "UpgradeCap"
Storage Strategy:
- Multisig Address: Transfer UpgradeCap to 2/3 multisig address, prevent single point of failure
- Timelock: Can add timelock mechanism, upgrades require advance announcement
- Burn (extreme case): If contract confirmed to never need upgrades, can burn UpgradeCap to completely guarantee immutability
// Transfer UpgradeCap to multisig address
const tx = new Transaction()
tx.transferObjects(
[tx.object(UPGRADE_CAP_ID)],
tx.pure.address(MULTISIG_ADDRESS)
)
Why is UpgradeCap one of the most dangerous objects after deployment?
Because it controls not a single transaction, but the entire protocol's future shape.
If poorly managed, two extreme risks appear:
- Stolen: an attacker can publish malicious upgrades
- Lost: you permanently lose the ability to upgrade
Neither is an ordinary business bug; both are protocol-level incidents.
Version Management
Recommend maintaining version numbers in contracts:
const CURRENT_VERSION: u64 = 2;
public struct VersionedConfig has key {
id: UID,
version: u64,
// ... config fields
}
// Call migration function when upgrading
const EMigrationNotNeeded: u64 = 0;
public fun migrate_v1_to_v2(
config: &mut VersionedConfig,
_cap: &UpgradeCap,
) {
assert!(config.version == 1, EMigrationNotNeeded);
// ... execute data migration
config.version = 2;
}
Version numbers are not decoration
Their real purpose is to let you answer, unambiguously:
- Which semantic version is this object currently on?
- Which set of fields should the frontend and scripts interpret it by?
- Does migration logic still need to run?
Otherwise, once old objects and new logic coexist in production, you quickly fall into the debugging hell of “is the data broken, or is the code broken?”
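On the frontend side, the version field can drive how an object is interpreted. A minimal sketch, assuming the VersionedConfig layout above and that numeric fields arrive as strings from the RPC layer:

```typescript
// Sketch: interpret an object by its on-chain version field rather
// than guessing the layout. Field names mirror the VersionedConfig
// example; the exact shapes are assumptions.
type RawFields = { version: string; [k: string]: unknown };

function interpretConfig(fields: RawFields): { version: number; hasMigrated: boolean } {
  const version = Number(fields.version);
  // Version 1 objects have not run migrate_v1_to_v2 yet, so the
  // frontend should use the v1 field set and may prompt a migration.
  return { version, hasMigrated: version >= 2 };
}
```

This single dispatch point is what keeps "old object, new code" from silently misreading fields: every consumer checks the version once, then uses the matching parser.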
23.5 dApp Deployment and Hosting
Static Deployment (Recommended)
# Build production version
npm run build
# Deploy to Vercel (automatic CI/CD)
vercel --prod
# Or deploy to GitHub Pages
gh-pages -d dist
Recommended Platforms:
| Platform | Features |
|---|---|
| Vercel | Automatic CI/CD, simple config, ample free tier |
| Cloudflare Pages | Global CDN, supports KV storage extensions |
| IPFS/Arweave | Truly decentralized deployment, permanent storage |
Environment Variable Configuration
# .env.production
VITE_SUI_RPC_URL=https://fullnode.mainnet.sui.io:443
VITE_WORLD_PACKAGE=0x_MAINNET_WORLD_PACKAGE_
VITE_MY_PACKAGE=0x_MAINNET_MY_PACKAGE_
VITE_TREASURY_ID=0x_MAINNET_TREASURY_ID_
The most underestimated problem in frontend deployment: caches and old links
After the on-chain code is published, the frontend won't necessarily switch over the way you expect. You also have to account for:
- CDN caches
- Browser caches
- Users' bookmarked old links
- Third-party pages referencing old domains or old parameters
In other words, frontend deployment isn't “upload the new bundle and done”; you also need to make sure users actually land on the new version's entry point.
23.6 Builder Positioning and Constraints in EVE Frontier
Understanding Builder boundaries is critical for long-term success:
What You Can Do (Layer 3)
- ✅ Write custom extension logic (Witness pattern)
- ✅ Build new economic mechanisms (markets, auctions, tokens)
- ✅ Create frontend dApp interfaces
- ✅ Add custom rules on existing facility types
- ✅ Compose with other Builders’ contracts
What You Cannot Change (Layer 1 & 2)
- ❌ Modify core game physics rules (location, energy system)
- ❌ Create brand new facility types (only CCP can do)
- ❌ Access unpublished Admin operations
- ❌ Bypass AdminACL’s server verification requirements
Design Technique: Find Space Within Constraints
Official Limitation: Stargates can only control passage via JumpPermit
Your Extension Space:
├── Permit validity period (time control)
├── Permit acquisition conditions (paid/hold NFT/quest completion)
├── Permit secondary market (resell passes)
└── Permit bulk purchase discounts
The right mindset toward constraints is not complaining “why don't you give me more permissions,” but:
Within the fixed rules of the world, find a sufficiently large product design space.
Truly mature Builders rarely want to change the world's foundations; instead, they excel at building, on top of existing interfaces:
- Stronger operational mechanisms
- Clearer permission design
- Better user experience
- Higher compositional value
23.7 Community Collaboration and Contribution
Composability: Your Contracts Can Be Used by Others
When you publish a market contract, other Builders can:
- Integrate your price oracle into their pricing systems
- Add referral commissions on top of your market
- Use your token as payment for their services
Design Recommendation: Expose necessary read interfaces to make your contract ecosystem-friendly:
// Expose query interfaces for other contracts to call
public fun get_current_price(market: &Market, item_type_id: u64): u64 {
// Return current price, other contracts can use for pricing reference
}
public fun is_item_available(market: &Market, item_type_id: u64): bool {
table::contains(&market.listings, item_type_id)
}
Contributing to Official Documentation
EVE Frontier documentation is open source:
# Clone documentation repo
git clone https://github.com/evefrontier/builder-documentation.git
# Create branch, add your tutorial or corrections
git checkout -b feat/add-auction-tutorial
# Submit PR
Contribution content includes:
- Find and fix documentation errors
- Supplement missing example code
- Translate documentation to other languages
- Share your best practice cases
Why community collaboration is not just icing on the cake
Because the real compound interest of the Builder ecosystem comes from reuse:
- You expose read interfaces; others can integrate them
- Others share best practices; you avoid their pitfalls
- More accurate documentation raises the whole ecosystem's development efficiency
There are direct returns for you, too:
- You are easier to integrate
- You build reputation faster
- You get early users and feedback sooner
Code of Conduct
All Builders must comply:
- ❌ Prohibited to harass or maliciously attack other players via programming infrastructure
- ❌ Prohibited deceptive economic behavior (like honeypot contracts)
- ✅ Encourage fair competition and transparent mechanisms
- ✅ Encourage collective knowledge and tool sharing
23.8 Sustainable Builder Strategy
Economic Sustainability
Revenue Source Design:
├── Transaction fees (1-3% of market trades)
├── Subscription services (monthly LUX subscription)
├── Premium features (paid unlock)
└── Alliance service contracts (B2B)
Cost Control:
├── Use read APIs (GraphQL/gRPC) instead of high-frequency on-chain writes
├── Aggregate multiple operations into single transaction
└── Utilize sponsored transactions to reduce user friction
Technical Sustainability
- Modular Design: Split functionality into independent modules for independent upgrades
- Backward Compatibility: New versions prioritize compatibility with old version data
- Documentation-Driven: Document your own contract APIs for easy integration by others
- Monitoring & Alerting: Subscribe to key events, get notified when anomalies occur
Technical and economic sustainability must be viewed together
Many projects die not because the technology is impossible or revenue is zero, but because the two sides disconnect:
- Lots of features, but maintenance costs are too high
- Revenue looks decent, but relies entirely on manual operations
- Users can get in, but have no reason to stay
A sustainable Builder project typically satisfies all of:
- A pricing model that is simple and explainable
- Permissions and operations that won't drag you down
- New versions that can evolve smoothly
- Someone who can respond quickly when critical issues appear
What deserves long-term tracking is not “how many versions you released”
But operational facts like these:
- How many real users completed key actions
- Which step has the highest churn
- Which transaction types fail most often
- Which features almost nobody uses
This data ultimately decides what you should keep doing and what you should stop.
23.9 Future of EVE Frontier Ecosystem
According to official documentation, the following features may be opened to Builders in the future:
- More Component Types: Programming interfaces for industrial facilities like refineries, manufacturing plants
- Zero-Knowledge Proofs: Use ZK proofs to replace server signatures for proximity verification, achieve full decentralization
- Richer Economic Interfaces: More official LUX/EVE Token interaction interfaces
Design Principle: Design for extensibility. Today’s contracts should be able to seamlessly integrate after tomorrow’s new features launch through upgrades.
The most realistic advice here is not “bet on every future direction,” but:
- First, thoroughly master the component capabilities that are actually usable today
- Then, leave evolution room for future interface changes
That is, future-readiness shouldn't come from “writing lots of conceptual interfaces nobody can use yet,” but from:
- Object structures that aren't hardcoded
- Config and policy that leave room for upgrades
- A frontend that doesn't interpret on-chain fields too rigidly
🔖 Chapter Summary
| Knowledge Point | Core Takeaway |
|---|---|
| Deployment process | localnet → testnet → mainnet three phases |
| Network switching | sui client switch --env mainnet |
| UpgradeCap security | Multisig storage, consider timelock |
| dApp deployment | Vercel/Cloudflare Pages + environment variables |
| Builder constraints | Layer 3 free extension, Layer 1/2 unchangeable |
| Community collaboration | Open APIs, contribute docs, follow code of conduct |
| Sustainability strategy | Diverse revenue + modular + monitoring |
📚 Further Reading
- Builder Constraints Documentation
- Contributing Guide
- Sui Package Upgrades
- EVE Frontier Development Roadmap (community channels)
Chapter 24: Troubleshooting Manual (Common Errors & Debugging Methods)
Goal: Systematically organize the most common error types encountered in EVE Frontier Builder development, master efficient debugging workflows, and minimize time spent “stepping on landmines.”
Status: Engineering support chapter. The main text focuses on troubleshooting paths and debugging habits.
24.1 Error Classification Overview
EVE Frontier Development Errors
├── Contract Errors (Move)
│ ├── Compilation errors (build failures)
│ ├── On-chain Abort (runtime failures)
│ └── Logic errors (successful execution but wrong results)
├── Transaction Errors (Sui)
│ ├── Gas issues
│ ├── Object version conflicts
│ └── Permission errors
├── dApp Errors (TypeScript/React)
│ ├── Wallet connection failures
│ ├── On-chain data reading failures
│ └── Parameter construction errors
└── Environment Errors
├── Docker/local node issues
├── Sui CLI configuration issues
└── Missing ENV variables
Truly efficient troubleshooting isn't about memorizing an error encyclopedia; it's about first classifying the problem into the correct layer.
A very practical first step is to ask:
- Did it break before compilation, or during on-chain execution?
- Are the objects and permissions wrong, or did the frontend construct the parameters incorrectly?
- Is the environment inconsistent, or is there actually a bug in the logic?
Get this first-level classification right and everything downstream goes much faster.
24.2 Move Compilation Errors
Error: unbound module
error[E02001]: unbound module
┌─ sources/my_ext.move:3:5
│
3 │ use world::gate;
│ ^^^^^^^^^^^ Unbound module 'world::gate'
Cause: Missing dependency declaration for the world package in Move.toml.
Solution:
# Move.toml
[dependencies]
World = { git = "https://github.com/evefrontier/world-contracts.git", subdir = "contracts/world", rev = "v0.0.14" }
Error: ability constraint not satisfied
error[E05001]: ability constraint not satisfied
┌─ sources/market.move:42:30
|
42 │ transfer::public_transfer(listing, recipient);
| ^^^^^^^ Missing 'store' ability
Cause: The Listing struct is missing the store ability and cannot be used with public_transfer.
Solution:
// Add required ability
public struct Listing has key, store { ... }
// ^^^^^
Error: unused variable / unused let binding
warning[W09001]: unused let binding
= 'receipt' is bound but not used
Solution: Use underscore to ignore, or confirm if a return step (Borrow-Use-Return pattern) is missing:
let (own_cap, receipt) = character::borrow_owner_cap(...);
// A hot-potato receipt cannot simply be discarded with an underscore;
// complete the Borrow-Use-Return cycle instead:
character::return_owner_cap(own_cap, receipt);
The Most Useful Habit for Compilation Errors
It's not copy-pasting the error into a search engine, but immediately deciding which category it belongs to:
- Dependency resolution issues: unbound module
- Type / ability issues: ability constraint not satisfied
- Resource lifecycle issues: unused let binding, unconsumed value, borrowing conflicts
Move compiler errors are usually already very close to the real cause, as long as you don't treat them as pure noise.
24.3 On-chain Abort Error Interpretation
On-chain Aborts return in the following format:
MoveAbort(MoveLocation { module: ModuleId { address: 0x..., name: Identifier("toll_gate_ext") }, function: 2, instruction: 6, function_name: Some("pay_toll") }, 1)
Key information: function_name + abort code (the number at the end).
Common Abort Code Reference Table
| Error Code | Typical Meaning | Investigation Direction |
|---|---|---|
| 0 | Insufficient permissions (assert!(ctx.sender() == owner)) | Check caller address vs owner stored in contract |
| 1 | Insufficient balance/quantity | Check coin::value() vs required amount |
| 2 | Object already exists (table::add duplicate key) | Check if already registered/purchased |
| 3 | Object does not exist (table::borrow not found) | Check if key is correct |
| 4 | Time validation failed (expired / not yet valid) | Compare clock.timestamp_ms() with contract logic |
| 5 | Incorrect state (e.g., already settled, not started) | Check state fields like is_settled, is_online |
Quick Locate Abort Source
# Search for error code in source code
grep -n "assert!.*4\b\|abort.*4\b\|= 4;" sources/*.move
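The MoveAbort string format shown above can also be parsed mechanically, so logs can be matched against the reference table. A small sketch, assuming the error surfaces as the string shown at the top of this section:

```typescript
// Sketch: pull function_name and the abort code out of a MoveAbort
// message so it can be matched against the abort-code table.
function parseMoveAbort(msg: string): { fn: string; code: number } | null {
  // e.g. ... function_name: Some("pay_toll") }, 1)
  const fn = /function_name: Some\("([^"]+)"\)/.exec(msg);
  const code = /\},\s*(\d+)\)\s*$/.exec(msg.trim()); // trailing ", N)"
  if (!fn || !code) return null;
  return { fn: fn[1], code: Number(code[1]) };
}
```

A helper like this is handy in dApp error handlers: instead of surfacing the raw MoveAbort string, show the user "pay_toll failed with code 1 (insufficient balance)".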
When You Hit an Abort, the First Reaction Shouldn't Be “the Contract Is Broken”
A more reliable order is:
- First look at function_name
- Then look at the abort code
- Then compare them against the objects, addresses, amounts, and time parameters passed in at the time
Many Aborts are not code bugs at all, but:
- The wrong object was used
- The current state doesn't meet the preconditions
- The frontend assembled expired or incomplete parameters
24.4 Gas Related Issues
InsufficientGas (Gas Exhausted)
TransactionExecutionError: InsufficientGas
Solution: Step-by-step Investigation
// 1. First dryRun to estimate Gas
const estimate = await client.dryRunTransactionBlock({
transactionBlock: await tx.build({ client }),
});
console.log("Gas estimate:", estimate.effects.gasUsed);
// 2. Set sufficient Gas Budget in actual transaction (+20% buffer)
const gasUsed = Number(estimate.effects.gasUsed.computationCost)
+ Number(estimate.effects.gasUsed.storageCost);
tx.setGasBudget(Math.ceil(gasUsed * 1.2));
GasBudgetTooHigh
Your Gas Budget exceeds your account balance:
// Query account SUI balance
const balance = await client.getBalance({ owner: address, coinType: "0x2::sui::SUI" });
const maxBudget = Number(balance.totalBalance) * 0.5; // Use max 50% of balance for Gas
tx.setGasBudget(Math.min(desired_budget, maxBudget));
Gas Issues Are Easily Misdiagnosed as “the Wallet Has No Money”
In reality there are three common causes:
- The account actually has no funds
- The gas budget is set too low
- The transaction model itself is too heavy
If you just keep raising the budget without examining the cost structure in the dry-run results, you usually only mask a structural problem.
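The two fixes above can be combined into one helper: buffer the dry-run estimate, but clamp to a share of the balance so the budget can never exceed what the account can pay. A minimal sketch; the 20% buffer and 50% cap follow the examples above, and all amounts are assumed to be in MIST:

```typescript
// Sketch: choose a gas budget from a dry-run estimate and the
// account balance. Ratios are the chapter's examples, not rules.
function chooseGasBudget(
  estimatedCost: number, // computationCost + storageCost from dryRun
  balance: number,       // totalBalance of the gas coin type
  bufferRatio = 1.2,     // +20% headroom over the estimate
  maxBalanceShare = 0.5, // never spend more than half the balance on gas
): number {
  const buffered = Math.ceil(estimatedCost * bufferRatio);
  const cap = Math.floor(balance * maxBalanceShare);
  return Math.min(buffered, cap);
}
```

If the returned budget is well below the buffered estimate, that is the signal to stop raising numbers and start restructuring the transaction instead.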
24.5 Object Version Conflicts
TransactionExecutionError: ObjectVersionUnavailableForConsumption
Cause: Your code holds an old version object reference, but it has been modified by another transaction on-chain.
Common scenario: Simultaneously initiating multiple transactions using the same shared object (like Market).
Solution:
// ❌ Wrong: Initiating multiple transactions using the same shared object in parallel
await Promise.all([buyTx1, buyTx2])
// ✅ Correct: Execute sequentially
for (const tx of [buyTx1, buyTx2]) {
await client.signAndExecuteTransaction({ transaction: tx })
// Wait for confirmation before submitting the next one
}
Version Conflicts Are Essentially a Reminder: Objects Are Alive
Whenever multiple transactions write to the same object, you must assume it may have changed before your next submission.
So this class of problem is rarely “sporadic and mysterious”; it is the system design telling you:
- There is a shared hotspot here
- Serialization, or refreshing the object version, is needed here
- Sharding or splitting the object may need to be reconsidered
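The sequential pattern above generalizes to any list of transactions touching the same shared object. A minimal sketch of a serializing executor:

```typescript
// Sketch: a tiny sequential queue for transactions that write to the
// same shared object. Each task starts only after the previous one
// settles, which avoids submitting against a stale object version.
async function runSequentially<T>(tasks: Array<() => Promise<T>>): Promise<T[]> {
  const results: T[] = [];
  for (const task of tasks) {
    // awaiting here serializes access to the shared hotspot
    results.push(await task());
  }
  return results;
}
```

Each task would wrap a `signAndExecuteTransaction` call; because tasks are thunks (functions), none of them begin executing until the queue reaches them.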
24.6 dApp Wallet Connection Issues
EVE Vault Not Detected
WalletNotFoundError: No wallet found
Investigation checklist:
- ✅ Is EVE Vault browser extension installed and enabled?
- ✅ Is VITE_SUI_NETWORK consistent with Vault's current network (testnet/mainnet)?
- ✅ Is the @evefrontier/dapp-kit version compatible with the Vault version?
// List all detected wallets (for debugging)
import { getWallets } from "@mysten/wallet-standard";
const wallets = getWallets();
console.log("Detected wallets:", wallets.get().map(w => w.name));
Signature Request Silently Rejected (No Popup)
Cause: Vault may be in locked state.
Solution: Check wallet status before initiating signature:
const currentAccount = useCurrentAccount();
if (!currentAccount) {
// Guide user to connect wallet instead of directly initiating signature
showConnectModal();
return;
}
Wallet Issue Investigation Order
The most reliable order is:
- Is the wallet detected?
- Is the current account connected?
- Is the network correct?
- Are the objects and permissions available to the current account?
Don't immediately suspect Vault itself when a signature fails; many issues are really the frontend's state, network, and object context being out of sync.
24.7 On-chain Data Reading Issues
getObject Returns null
const obj = await client.getObject({ id: "0x...", options: { showContent: true } });
if (!obj.data) {
// Object doesn't exist, or ID is wrong
console.error("Object doesn't exist, check if ID is correct (may be testnet/mainnet confusion)");
}
Common causes:
- Used testnet Object ID to query mainnet (or vice versa)
- The object has been deleted (the contract called id.delete())
- A typo in the Object ID
showContent: true but content.fields is empty
const content = obj.data?.content;
if (content?.dataType !== "moveObject") {
// This is a package object, not a Move object
console.error("Object is not a MoveObject, ID may point to a Package");
}
When Data Can't Be Read, Check These Four Things First
- Is the ID from the correct network?
- Is this ID an object or a package?
- Has the object been deleted or migrated?
- Does the frontend's parsing path match the actual field structure?
Many “can't read it” problems aren't a broken node; you simply queried the wrong object.
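The two checks above (missing object, package vs Move object) can be folded into one classifier run before any field access. A sketch; the response shape loosely follows the JSON-RPC result and is an assumption, not the full SDK type:

```typescript
// Sketch: classify a getObject response before touching its fields,
// so "queried the wrong thing" fails loudly instead of silently.
type ObjResponse = {
  data?: { content?: { dataType: "moveObject" | "package"; fields?: unknown } };
};

function classifyObject(resp: ObjResponse): "missing" | "package" | "moveObject" {
  if (!resp.data) return "missing"; // wrong network, deleted object, or typo
  if (resp.data.content?.dataType === "package") return "package";
  return "moveObject"; // safe to read content.fields from here on
}
```

Routing every `getObject` result through a classifier like this turns the confusing "fields are undefined" symptom into one of three explicit diagnoses.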
24.8 Local Development Environment Issues
Docker Local Chain Startup Failed
# View container logs
docker compose logs -f
# Common cause: Port occupied
lsof -i :9000
kill -9 <PID>
# Reset local chain state (clear all data and restart)
docker compose down -v
docker compose up -d
sui client publish Failed
# Error: Package verification failed
# Cause: Dependent world-contracts address inconsistent with local node
# In Move.toml, confirm using localnet package address for local testing
[addresses]
world = "0x_LOCAL_WORLD_ADDRESS_" # Obtain from local chain deployment results
Contract Cannot Be Called After Deployment (Function Not Found)
# Check if published package ID matches ENV configuration
echo $VITE_WORLD_PACKAGE
# Verify on-chain package contains expected function
sui client object 0x_PACKAGE_ID_ --json | jq '.content.disassembled'
Environment Issues Are Worst When Things Are “Half-Right”
Meaning:
- The local chain is fine
- The CLI can connect
- But some address, dependency, or ENV value still points at another environment
This class of problem is annoying because each layer “looks fine” on the surface. So whenever you hit an environment issue, print all of these at once:
- Current network
- Current address
- Current package ID
- Current ENV configuration
Checking them together is much faster than guessing one by one.
24.9 Debugging Workflow: Systematic Investigation
When encountering problems, investigate in the following order:
1. Read error message (don't ignore any details)
├── Is it a Move abort? → Find abort code → Check contract source
├── Is it a Gas issue? → dryRun estimate → Adjust budget
└── Is it a TypeScript error? → console.log parameters at each step
2. Isolate the problem
├── Call contract directly using Sui Explorer (bypass dApp)
├── Write Move unit tests to reproduce the problem
└── Test GraphQL queries using curl/Postman
3. Align with community
├── Search Discord #builders channel
├── Paste complete error message (including Transaction Digest)
└── Provide minimal reproducible code
A More Practical Investigation Mindset
Each time, try to reduce the problem to its minimum:
- The fewest objects
- The single smallest operation
- The shortest call chain
Once an on-chain system involves frontend, backend, wallet, indexer, and game server, the problem surface expands rapidly. Reduce first, then locate: that order is by far the most efficient.
24.10 Common Debugging Tools
| Tool | Purpose | Link |
|---|---|---|
| Sui Explorer | View transaction details, object state | https://suiexplorer.com |
| Sui GraphQL IDE | Manually test GraphQL queries | https://graphql.testnet.sui.io |
| Move Prover | Formal verification of contract properties | sui move prove |
| dryRun | Gas estimation and simulation execution | client.dryRunTransactionBlock() |
| sui client call | Call contract directly from command line | sui client call --help |
🔖 Chapter Summary
| Error Type | Fastest Investigation Path |
|---|---|
| Move compilation errors | Check Move.toml dependencies + ability declarations |
| Abort (code N) | grep abort code in contract source, quick lookup table |
| Gas exhausted | dryRun() estimate + set 20% buffer |
| Object version conflict | Sequential execution instead of concurrent, wait for each confirm |
| Wallet not detected | Check extension installation, network consistency, version compatibility |
| Object read returns empty | Confirm network environment (testnet vs mainnet) |
| Local chain issues | docker compose logs + reset data volume |
Chapter 25: From Builder to Product — Commercialization Paths and Ecosystem Operations
Goal: Go beyond the technical layer to understand how to transform your EVE Frontier contracts and dApps into real products with users, revenue, and community, and how to find your position in this emerging ecosystem.
Status: Product chapter. Main text focuses on business models, growth, and operational mechanisms.
25.1 Four Business Models for Builders
In the EVE Frontier ecosystem, Builders have four main value capture methods:
Builder Business Model Spectrum:
| Model | Representative Cases | Revenue Source |
|---|---|---|
| Infrastructure | Stargate tolls, storage markets, general auctions | Usage fees (automatic) |
| Token Economy | Alliance Token + DAO, points systems | Token appreciation, taxes |
| Platform / SaaS | Multi-tenant market frameworks, competition systems | Platform fees, monthly/registration fees |
| Data & Tools | Leaderboards, analytics dashboards, price aggregators | Ads, subscriptions, premium tiers |
The point of this chart isn't to help you pick a track name; it's to make you see clearly what exactly you are selling: assets, traffic, protocol capability, or an information advantage.
Many Builder projects fail not because of poor technology, but because they never worked out what they were selling in the first place.
25.2 Pricing Strategy: On-chain Automatic Revenue
The simplest Builder revenue: automatic commission on transactions, zero operational cost.
Two-tier Fee Structure
// Settlement: platform fee + builder fee dual structure
public fun settle_sale(
    market: &mut Market,
    sale_price: u64,
    mut payment: Coin<SUI>,
    ctx: &mut TxContext,
): Coin<SUI> {
    // 1. Platform protocol fee (EVE Frontier official, if any)
    let protocol_fee = sale_price * market.protocol_fee_bps / 10_000;
    // 2. Your Builder fee
    let builder_fee = sale_price * market.builder_fee_bps / 10_000; // e.g.: 200 = 2%
    // 3. Split the builder fee off the payment
    transfer::public_transfer(payment.split(builder_fee, ctx), market.fee_recipient);
    // ... split protocol_fee to the official address the same way
    // What remains in `payment` (sale_price - protocol_fee - builder_fee) goes back to the seller
    payment
}
Fee Range Recommendations
| Type | Suggested Range | Notes |
|---|---|---|
| Stargate toll | 5-50 SUI/time | Fixed fee, reflects scarcity |
| Market commission | 1-3% | Benchmark traditional markets |
| Auction platform fee | 2-5% | For matchmaking services provided |
| Multi-tenant platform monthly fee | 10-100 SUI | Other Builders using your framework |
Why Automatic Commission Looks Best But Is Also Easiest to Overestimate
Its advantages are obvious:
- Revenue automation
- No manual billing
- Directly tied to actual usage
But it also has prerequisites:
- Users are actually willing to continue using your facilities
- Your fees won’t be instantly undercut by cheaper alternatives
- Your service has clear differentiation, not pure commodity channels
So on-chain automatic revenue isn’t “write a contract and money comes” - it just executes the business model more cleanly.
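The bps arithmetic behind the fee ranges above is worth having off-chain too, e.g. for a price preview in the dApp. A sketch mirroring the on-chain math (integer division, same 10_000 bps base):

```typescript
// Two-tier fee split in basis points: 10_000 bps = 100%.
// Mirrors the on-chain settle_sale math for frontend display.
function splitFees(salePrice: number, protocolBps: number, builderBps: number) {
  const protocol = Math.floor((salePrice * protocolBps) / 10_000);
  const builder = Math.floor((salePrice * builderBps) / 10_000);
  return { protocol, builder, seller: salePrice - protocol - builder };
}
// A 10_000 MIST sale at 1% protocol + 2% builder fee:
// splitFees(10_000, 100, 200) → { protocol: 100, builder: 200, seller: 9700 }
```

Keeping the frontend preview and the Move settlement on the same integer math avoids off-by-one surprises between quoted and settled amounts.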
25.3 User Acquisition: In-game Touchpoints
Main paths for players to discover your dApp:
Touchpoint Priority:
1. In-game display (highest conversion)
└── Player approaches your stargate/turret → In-game overlay pops up → Direct interaction
2. EVE Frontier official Builder directory (expected feature)
└── Official lists certified Builder services → Player actively searches
3. Player community (Discord / Reddit)
└── Word of mouth → Alliance recommendations → User growth
4. In-alliance promotion
└── Partner with major alliances → Embed in their toolchain → Bulk users
Growth Flywheel Design
Players use service
↓
Receive rewards (tokens/NFT/privileges)
↓
Value is visible and tradable
↓
Show off/sell to other players
↓
More players learn about and join
↓
(Back to top)
Most Easily Overlooked Point in User Acquisition
It’s not “how to get more people to click the first time,” but “why users stay after clicking.”
Especially in EVE scenarios, many features are strongly context-bound:
- Players use them the moment they need them
- Players leave the moment they don't
So what really needs to be designed is:
- Is first use smooth enough
- Will they come back a second time
- Will it form alliance-level or group-level dependency
25.4 Community Building: Builder’s Moat
In EVE Frontier, community is your most uncopyable asset. Technology can be copied, but relationships cannot.
Community Building Levels
1. Discord Server
├── #announcements (version updates, new features)
├── #support (user Q&A)
├── #feedback (collect opinions)
└── #governance (important decision voting)
2. Regular Communication
├── Monthly AMA (Ask Me Anything)
├── Transparent financial reports (show Treasury balance and dividend plans)
└── Public roadmap updates
3. Community Incentives
├── Early user NFT badges (see Example 8)
├── Feedback rewards (report bugs get tokens)
└── Referral rewards (bring new users to register alliance)
The true value of community isn’t in numbers, but in relationship strength and feedback quality.
A small but active Builder community is often more useful than a large but silent channel because it provides:
- Real demand feedback
- Problem reproduction samples
- First wave of evangelists
- Early co-builders
25.5 Transparency: Trustworthy On-chain Operations
On-chain data is naturally transparent - turn it into a competitive advantage:
// Generate a monthly public financial report
// Assumes a configured SuiClient instance and your published package ID
import { SuiClient, getFullnodeUrl } from '@mysten/sui/client';

const client = new SuiClient({ url: getFullnodeUrl('testnet') });
const PKG = '0x_PACKAGE_ID_'; // your published package

async function generateMonthlyReport(treasuryId: string) {
  const treasury = await client.getObject({
    id: treasuryId,
    options: { showContent: true },
  });
  const fields = (treasury.data?.content as any)?.fields;
  const events = await client.queryEvents({
    query: { MoveEventType: `${PKG}::treasury::FeeCollected` },
    // Filter for this month's time range...
  });
  const totalCollected = events.data.reduce(
    (sum, e) => sum + Number((e.parsedJson as any).amount), 0
  );
  return {
    date: new Date().toISOString().slice(0, 7), // "2026-03"
    totalRevenueSUI: totalCollected / 1e9,
    currentBalanceSUI: Number(fields.balance) / 1e9,
    totalUserTransactions: events.data.length,
    topServices: calculateTopServices(events.data),
  };
}
Transparency isn’t just “making data public” - it’s making users understand:
- Where money comes from
- Where money goes
- Which rules are fixed
- Which adjustments were changed later
As long as these can be understood by users, your protocol’s trust cost will significantly decrease.
25.6 Compliance and Risk Management
Although EVE Frontier is decentralized, Builders still need to be aware:
Technical Risks
| Risk | Mitigation Measures |
|---|---|
| Contract vulnerabilities causing asset loss | Pre-launch audit; TimeLock upgrades; Set single transaction limits |
| Package upgrades breaking users | Versioned API; Announcement period; Migration subsidies |
| Sui network failures | Manage user expectations well; Set time-based protections |
| Dependent World Contracts upgrades | Follow official changelog; Testnet validation |
Community Risks
| Risk | Mitigation Measures |
|---|---|
| User churn | Continuously deliver value; Listen to feedback |
| Competitor copying | Accelerate iteration; Build user relationship moat |
| Negative sentiment | Quick public response; Transparent communication |
What Matters Most in Risk Management Is the Contingency Plan
Not deciding what to say after something happens, but knowing in advance:
- Which layer to stop first when a vulnerability is found
- Whether the frontend needs to hide certain entry points first
- Whether sponsored services need to be paused
- Which states and assets need protecting first
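One way to make the plan concrete is to encode the switches as configuration before any incident happens. A sketch with illustrative flag names (your actual pause mechanisms live in the contract, frontend, and sponsor service respectively):

```typescript
// Contingency checklist as a feature-flag config: decide these switches
// before an incident, not during one. Flag names are illustrative.
interface EmergencyFlags {
  contractPaused: boolean;      // halt state-changing entry points first
  frontendEntryHidden: boolean; // hide risky UI entry points
  sponsorshipPaused: boolean;   // stop the sponsored-tx service
}

function incidentPlan(flags: EmergencyFlags): string[] {
  const steps: string[] = [];
  if (!flags.contractPaused) steps.push('pause contract entry points');
  if (!flags.frontendEntryHidden) steps.push('hide frontend entry points');
  if (!flags.sponsorshipPaused) steps.push('pause sponsored transactions');
  return steps;
}
```

The value isn't the code itself but the discipline: every switch in the struct must actually exist somewhere in your stack before launch.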
25.7 Long-term Sustainability: Progressive Decentralization
The healthiest Builder projects should move toward progressive decentralization:
Stage 1 (Launch): Builder centralized control
• Rapid iteration, flexible adjustments
• Build initial user base and cash flow
Stage 2 (Growth): Introduce community governance
• Important parameters (fees, new features) DAO voting
• Token holders gain proposal rights
Stage 3 (Maturity): Full community autonomy
• All key decisions via on-chain governance
• Builder transitions to contributor role
• Protocol revenue fully distributed to token holders
The most important restraint here is: not every project must reach “full community autonomy.”
A more realistic question should be:
- Does this project really need governance tokens
- Is the community mature enough to bear governance responsibility
- Which powers are suitable for delegation, which should remain at execution layer
25.8 EVE Frontier Ecosystem Collaboration Opportunities
Don’t go it alone - seek synergies:
Horizontal collaboration (similar Builders):
├── Share technical standards (interface protocols)
├── Joint marketing
└── Mutual user referrals (your users → my service)
Vertical collaboration (different tier Builders):
├── Infrastructure Builders provide APIs
├── Application Builders build on top
└── User experience Builders create portal aggregations
Collaborate with CCP:
├── Apply for official Featured Builder certification
├── Participate in official testing and feedback projects
└── Showcase your tools at official events
The real value of collaboration usually comes in three forms:
- Distribution: more people learn about you, faster
- Complementarity: you don't have to build the full stack from scratch yourself
- Legitimacy: users feel more confident using your service
25.9 Core Traits of Successful Builders
From technology to product, you need more than just Move code:
| Technical Capabilities (you already have) | Strategic Capabilities (equally important) |
|---|---|
| ✅ Move contract development | ✅ User needs insight |
| ✅ Full-stack dApp development | ✅ Rapid product iteration |
| ✅ Security & testing | ✅ Community building & communication |
| ✅ Performance optimization | ✅ Business model design |
| ✅ Upgrades & maintenance | ✅ Competitive analysis & differentiation |
Truly long-term successful Builders are usually not “the best code writers,” but those who can hold technology, product, community and timing together.
25.10 Your Builder Journey Roadmap
Month 0-1 (Learning):
├── Complete all chapters and examples in this course
├── Deploy Example 1-2 on testnet
└── Join Builder Discord, meet the community
Month 1-3 (Experimentation):
├── Release testnet version of first product
├── Invite test users, collect feedback
└── Iterate 2-3 rounds
Month 3-6 (Validation):
├── Mainnet launch (small scale, cautious testing)
├── Achieve first on-chain revenue
└── Build initial community (Discord 100+ members)
Month 6-12 (Growth):
├── Monthly active users 1000+
├── Introduce token economy (if suitable)
└── Establish first cross-Builder collaboration
Year 2+ (Ecosystem):
├── Become "infrastructure" in the ecosystem
├── Progressive community governance
└── Sustainable self-operation
This roadmap should be treated as a “stage judgment framework” rather than a KPI checklist.
Because different products will have very different paces, but one judgment always holds:
Prove people are actually using it first, then scale; prove the model works first, then complexify.
🔖 Chapter Summary
| Dimension | Key Points |
|---|---|
| Business models | Four models: infrastructure/token/platform/data |
| Pricing strategy | On-chain automatic commission, zero operational cost |
| User acquisition | In-game touchpoints first, community word-of-mouth second |
| Community building | Discord + transparent reports + incentive mechanisms |
| Risk management | Technical audit + upgrade time locks + rapid response |
| Long-term sustainability | Progressive decentralization, eventual community autonomy |
🎓 Course Complete! You Are Now an EVE Frontier Builder
Congratulations on completing all the chapters and practical examples of this course.
You now have mastered:
- ✅ Move smart contracts from beginner to advanced
- ✅ Complete development and deployment of four smart component types
- ✅ Full-stack dApp development and production-grade architecture
- ✅ On-chain economics, NFTs, DAO governance design
- ✅ Security audits, performance optimization, upgrade strategies
- ✅ Commercialization paths and ecosystem operations
In this universe, code is the laws of physics. Go build your universe. 🚀
📚 Bookmark These Resources
| Resource | Purpose |
|---|---|
| EVE Frontier Official Site | Latest official announcements |
| builder-documentation | Official technical documentation |
| world-contracts | World contract source code |
| builder-scaffold | Project scaffold |
| Sui Documentation | Sui blockchain documentation |
| Move Book | Move language reference |
| EVE Frontier Discord | Builder community |
| Sui GraphQL IDE | On-chain data queries |
Example 5: Alliance Token & Automatic Dividend System
Goal: Issue an alliance-specific Coin (`ALLY Token`) and build an automatic dividend contract: alliance facility revenue is automatically distributed to token holders by holding ratio, with a governance panel dApp.
Status: Teaching example. Repository includes alliance token, treasury, and governance source code, focusing on understanding how capital flow and governance flow coexist.
Code Directory
Minimal Call Chain
Issue ALLY Token -> Revenue flows to treasury -> Distribute dividends by holdings -> Propose -> Members vote
Requirements Analysis
Scenario: Your alliance operates multiple gate toll stations and storage box markets, with revenue from multiple channels. You want:
- 💎 Issue `ALLY Token` (total supply 1,000,000), distributed to alliance members by contribution
- 🏦 All facility revenue flows to the alliance treasury (Treasury)
- 💸 Members holding `ALLY Token` receive periodic dividends based on holding ratio
- 🗳 Token holders can vote on major alliance decisions (like fee adjustments)
- 📊 Governance panel displays treasury balance, dividend history, proposal list
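Since ALLY is created with 6 decimals below, every on-chain amount is in millionths of a token. A small conversion helper (names are illustrative) keeps the dApp's display math consistent with the contract:

```typescript
// ALLY uses 6 decimals, so raw on-chain amounts are in millionths of a token.
// BigInt mirrors the u64 amounts the contract works with.
const ALLY_DECIMALS = 6;

function toDisplayAlly(rawAmount: bigint): string {
  const divisor = 10n ** BigInt(ALLY_DECIMALS);
  const whole = rawAmount / divisor;
  // Keep two fraction digits for display, truncating (not rounding) like integer math
  const frac = (rawAmount % divisor).toString().padStart(ALLY_DECIMALS, '0').slice(0, 2);
  return `${whole}.${frac}`;
}
// toDisplayAlly(1_500_000n) → "1.50"
```

The dashboard in Part 4 divides by 1e6 for the same reason; centralizing the conversion avoids mixing raw and display units.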
Part 1: Alliance Token Contract
module ally_dao::ally_token;
use sui::coin::{Self, Coin, TreasuryCap, CoinMetadata};
use sui::transfer;
use sui::tx_context::TxContext;
/// One-Time Witness
public struct ALLY_TOKEN has drop {}
fun init(witness: ALLY_TOKEN, ctx: &mut TxContext) {
let (treasury_cap, coin_metadata) = coin::create_currency(
witness,
6, // Decimals: 6 decimal places
b"ALLY", // Symbol
b"Alliance Token", // Name
b"Governance and dividend token for Alliance X",
option::none(),
ctx,
);
// TreasuryCap given to alliance DAO contract (via address or multisig)
transfer::public_transfer(treasury_cap, ctx.sender());
transfer::public_freeze_object(coin_metadata); // Metadata immutable
}
/// Mint (package-internal: controlled by the DAO contract, not exposed to the public)
public(package) fun internal_mint(
treasury: &mut TreasuryCap<ALLY_TOKEN>,
amount: u64,
recipient: address,
ctx: &mut TxContext,
) {
let coin = coin::mint(treasury, amount, ctx);
transfer::public_transfer(coin, recipient);
}
Part 2: DAO Treasury & Dividend Contract
module ally_dao::treasury;
use ally_dao::ally_token::ALLY_TOKEN;
use sui::coin::{Self, Coin, TreasuryCap};
use sui::balance::{Self, Balance};
use sui::object::{Self, UID, ID};
use sui::table::{Self, Table};
use sui::event;
use sui::transfer;
use sui::tx_context::TxContext;
use sui::sui::SUI;
// ── Data Structures ──────────────────────────────────────────────
/// Alliance Treasury
public struct AllianceTreasury has key {
id: UID,
sui_balance: Balance<SUI>, // SUI awaiting distribution
total_distributed: u64, // Total historical dividends distributed
distribution_index: u64, // Current dividend round
total_ally_supply: u64, // Current ALLY Token circulating supply
}
/// Dividend claim voucher (records which round each holder claimed to)
public struct DividendClaim has key, store {
id: UID,
holder: address,
last_claimed_index: u64,
}
/// Dividend snapshot (create one per distribution)
public struct DividendSnapshot has store {
amount_per_token: u64, // SUI amount per ALLY Token (in minimum precision)
total_supply_at_snapshot: u64,
}
// ── Events ──────────────────────────────────────────────────
public struct DividendDistributed has copy, drop {
treasury_id: ID,
total_amount: u64,
per_token_amount: u64,
distribution_index: u64,
}
public struct DividendClaimed has copy, drop {
holder: address,
amount: u64,
rounds: u64,
}
// ── Initialization ────────────────────────────────────────────────
public fun create_treasury(
total_ally_supply: u64,
ctx: &mut TxContext,
) {
let treasury = AllianceTreasury {
id: object::new(ctx),
sui_balance: balance::zero(),
total_distributed: 0,
distribution_index: 0,
total_ally_supply,
};
transfer::share_object(treasury);
}
// ── Deposit Revenue ──────────────────────────────────────────────
/// Any contract (gate, market, etc.) can deposit revenue to treasury
public fun deposit_revenue(treasury: &mut AllianceTreasury, coin: Coin<SUI>) {
balance::join(&mut treasury.sui_balance, coin::into_balance(coin));
}
// ── Trigger Distribution ──────────────────────────────────────────
/// Admin triggers: prepare dividend distribution from current treasury balance by ratio
/// Need to store snapshot for each round
public fun trigger_distribution(
treasury: &mut AllianceTreasury,
    _ctx: &TxContext,
) {
let total = balance::value(&treasury.sui_balance);
assert!(total > 0, ENoBalance);
assert!(treasury.total_ally_supply > 0, ENoSupply);
// Amount per token (in minimum precision, multiply by 1e6 to avoid precision loss)
let per_token_scaled = total * 1_000_000 / treasury.total_ally_supply;
treasury.distribution_index = treasury.distribution_index + 1;
treasury.total_distributed = treasury.total_distributed + total;
// Store snapshot to dynamic field
sui::dynamic_field::add(
&mut treasury.id,
treasury.distribution_index,
DividendSnapshot {
amount_per_token: per_token_scaled,
total_supply_at_snapshot: treasury.total_ally_supply,
}
);
event::emit(DividendDistributed {
treasury_id: object::id(treasury),
total_amount: total,
per_token_amount: per_token_scaled,
distribution_index: treasury.distribution_index,
});
}
// ── Holder Claims Dividends ────────────────────────────────────────
/// Holder provides their ALLY Token (not consumed, only read quantity) to claim dividends
public fun claim_dividends(
treasury: &mut AllianceTreasury,
ally_coin: &Coin<ALLY_TOKEN>, // Holder's ALLY Token (read-only)
claim_record: &mut DividendClaim,
ctx: &mut TxContext,
) {
assert!(claim_record.holder == ctx.sender(), ENotHolder);
let holder_balance = coin::value(ally_coin);
assert!(holder_balance > 0, ENoAllyTokens);
let from_index = claim_record.last_claimed_index + 1;
let to_index = treasury.distribution_index;
assert!(from_index <= to_index, ENothingToClaim);
let mut total_claim: u64 = 0;
let mut i = from_index;
while (i <= to_index) {
let snapshot: &DividendSnapshot = sui::dynamic_field::borrow(
&treasury.id, i
);
// Calculate by holding ratio (reverse scaling)
total_claim = total_claim + (holder_balance * snapshot.amount_per_token / 1_000_000);
i = i + 1;
};
assert!(total_claim > 0, ENothingToClaim);
claim_record.last_claimed_index = to_index;
let payout = sui::coin::take(&mut treasury.sui_balance, total_claim, ctx);
transfer::public_transfer(payout, ctx.sender());
event::emit(DividendClaimed {
holder: ctx.sender(),
amount: total_claim,
rounds: to_index - from_index + 1,
});
}
/// Create claim record (each holder creates once)
public fun create_claim_record(ctx: &mut TxContext) {
let record = DividendClaim {
id: object::new(ctx),
holder: ctx.sender(),
last_claimed_index: 0,
};
transfer::transfer(record, ctx.sender());
}
const ENoBalance: u64 = 0;
const ENoSupply: u64 = 1;
const ENotHolder: u64 = 2;
const ENoAllyTokens: u64 = 3;
const ENothingToClaim: u64 = 4;
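The scaling trick in trigger_distribution and claim_dividends (multiply by 1e6 before dividing, then reverse the scaling on claim) can be sanity-checked off-chain. A sketch using BigInt to mirror u64 integer division, including the rounding dust the treasury keeps:

```typescript
// Mirror the contract's fixed-point dividend math with BigInt (u64-like division).
function perTokenScaled(treasuryTotal: bigint, allySupply: bigint): bigint {
  return (treasuryTotal * 1_000_000n) / allySupply; // scale by 1e6 first to limit precision loss
}

function claimAmount(holderBalance: bigint, scaled: bigint): bigint {
  return (holderBalance * scaled) / 1_000_000n; // reverse the scaling on claim
}
// With 1_000 SUI (in MIST) distributed over the full ALLY supply,
// a holder of 10% of supply claims 10% of the pot (minus rounding dust).
```

Truncating division means the sum of all claims can be slightly below the distributed total; that dust simply stays in the treasury balance, which is the safe direction.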
Part 3: Governance Voting Contract
module ally_dao::governance;
use ally_dao::ally_token::ALLY_TOKEN;
use sui::coin::Coin;
use sui::object::{Self, UID};
use sui::clock::Clock;
use sui::transfer;
use sui::event;
public struct Proposal has key {
id: UID,
proposer: address,
description: vector<u8>,
vote_yes: u64,
vote_no: u64,
deadline_ms: u64,
executed: bool,
}
/// Create proposal (requires holding at least 1000 ALLY Token)
public fun create_proposal(
ally_coin: &Coin<ALLY_TOKEN>,
description: vector<u8>,
voting_duration_ms: u64,
clock: &Clock,
ctx: &mut TxContext,
) {
// Must hold enough tokens to propose
assert!(sui::coin::value(ally_coin) >= 1_000_000_000, EInsufficientToken); // 1000 ALLY
let proposal = Proposal {
id: object::new(ctx),
proposer: ctx.sender(),
description,
vote_yes: 0,
vote_no: 0,
deadline_ms: clock.timestamp_ms() + voting_duration_ms,
executed: false,
};
transfer::share_object(proposal);
}
/// Vote (weighted by ALLY Token quantity)
public fun vote(
proposal: &mut Proposal,
ally_coin: &Coin<ALLY_TOKEN>,
support: bool,
clock: &Clock,
_ctx: &TxContext,
) {
assert!(clock.timestamp_ms() < proposal.deadline_ms, EVotingEnded);
let weight = sui::coin::value(ally_coin);
    if (support) {
proposal.vote_yes = proposal.vote_yes + weight;
} else {
proposal.vote_no = proposal.vote_no + weight;
};
}
const EInsufficientToken: u64 = 0;
const EVotingEnded: u64 = 1;
Part 4: Governance Dashboard dApp
// src/GovernanceDashboard.tsx
import { useState, useEffect } from 'react'
import { useConnection, getObjectWithJson, executeGraphQLQuery } from '@evefrontier/dapp-kit'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
const DAO_PACKAGE = "0x_DAO_PACKAGE_"
const TREASURY_ID = "0x_TREASURY_ID_"
interface TreasuryInfo {
sui_balance: string
total_distributed: string
distribution_index: string
total_ally_supply: string
}
interface Proposal {
id: string
description: string
vote_yes: string
vote_no: string
deadline_ms: string
executed: boolean
}
export function GovernanceDashboard() {
const { isConnected, handleConnect, currentAddress } = useConnection()
const dAppKit = useDAppKit()
const [treasury, setTreasury] = useState<TreasuryInfo | null>(null)
const [proposals, setProposals] = useState<Proposal[]>([])
const [allyBalance, setAllyBalance] = useState<number>(0)
const [claimRecordId, setClaimRecordId] = useState<string | null>(null)
const [status, setStatus] = useState('')
// Load treasury data
useEffect(() => {
getObjectWithJson(TREASURY_ID).then(obj => {
if (obj?.content?.dataType === 'moveObject') {
setTreasury(obj.content.fields as TreasuryInfo)
}
})
}, [])
// Claim dividends
const claimDividends = async () => {
if (!claimRecordId) {
setStatus('⚠️ Please create claim record first')
return
}
const tx = new Transaction()
tx.moveCall({
target: `${DAO_PACKAGE}::treasury::claim_dividends`,
arguments: [
tx.object(TREASURY_ID),
tx.object('ALLY_COIN_ID'), // User's ALLY Coin object ID
tx.object(claimRecordId),
],
})
try {
const r = await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Dividends claimed! ${r.digest.slice(0, 12)}...`)
} catch (e: any) {
setStatus(`❌ ${e.message}`)
}
}
// Vote
const vote = async (proposalId: string, support: boolean) => {
const tx = new Transaction()
tx.moveCall({
target: `${DAO_PACKAGE}::governance::vote`,
arguments: [
tx.object(proposalId),
tx.object('ALLY_COIN_ID'), // User's ALLY Coin object ID
tx.pure.bool(support),
tx.object('0x6'), // Clock
],
})
try {
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`✅ Vote successful`)
} catch (e: any) {
setStatus(`❌ ${e.message}`)
}
}
return (
<div className="governance-dashboard">
<header>
<h1>🏛 Alliance DAO Governance Center</h1>
{!isConnected
? <button onClick={handleConnect}>Connect Wallet</button>
: <span>✅ {currentAddress?.slice(0, 8)}...</span>
}
</header>
{/* Treasury Status */}
<section className="treasury-panel">
<h2>💰 Alliance Treasury</h2>
<div className="stats-grid">
<div className="stat">
<span className="label">Current Balance</span>
<span className="value">
{((Number(treasury?.sui_balance ?? 0)) / 1e9).toFixed(2)} SUI
</span>
</div>
<div className="stat">
<span className="label">Total Distributed</span>
<span className="value">
{((Number(treasury?.total_distributed ?? 0)) / 1e9).toFixed(2)} SUI
</span>
</div>
<div className="stat">
<span className="label">Distribution Rounds</span>
<span className="value">{treasury?.distribution_index ?? '-'}</span>
</div>
<div className="stat">
<span className="label">Your ALLY Holdings</span>
<span className="value">{(allyBalance / 1e6).toFixed(2)} ALLY</span>
</div>
</div>
<button className="claim-btn" onClick={claimDividends} disabled={!isConnected}>
💸 Claim Pending Dividends
</button>
</section>
{/* Governance Proposals */}
<section className="proposals-panel">
<h2>🗳 Current Proposals</h2>
{proposals.length === 0
? <p>No active proposals</p>
: proposals.map(p => {
const total = Number(p.vote_yes) + Number(p.vote_no)
const yesPct = total > 0 ? Math.round(Number(p.vote_yes) * 100 / total) : 0
const expired = Date.now() > Number(p.deadline_ms)
return (
<div key={p.id} className="proposal-card">
<p className="proposal-desc">{p.description}</p>
<div className="vote-bar">
<div className="yes-bar" style={{ width: `${yesPct}%` }} />
</div>
<div className="vote-stats">
<span>✅ {(Number(p.vote_yes) / 1e6).toFixed(0)} ALLY</span>
<span>❌ {(Number(p.vote_no) / 1e6).toFixed(0)} ALLY</span>
</div>
{!expired && !p.executed && (
<div className="vote-actions">
<button onClick={() => vote(p.id, true)}>👍 Support</button>
<button onClick={() => vote(p.id, false)}>👎 Oppose</button>
</div>
)}
{expired && <span className="badge">Voting Ended</span>}
</div>
)
})
}
</section>
{status && <div className="status-bar">{status}</div>}
</div>
)
}
🎯 Complete Review
Move Contract Layer
├── ally_token.move → Issue ALLY_TOKEN (total supply controlled by TreasuryCap)
├── treasury.move
│ ├── AllianceTreasury → Shared treasury object, receives multi-channel revenue
│ ├── DividendClaim → Holder's claim voucher (records claimed rounds)
│ ├── deposit_revenue() ← Gate/market contracts call
│ ├── trigger_distribution() ← Admin triggers, prepares dividends by snapshot
│ └── claim_dividends() ← Holders self-claim
└── governance.move
├── Proposal → Governance proposal shared object
├── create_proposal() ← Must hold 1000+ ALLY to propose
└── vote() ← ALLY holdings weighted voting
Integration with Other Facilities
└── In example-02's toll_gate.move call
treasury::deposit_revenue(alliance_treasury, fee_coin)
→ Gate toll goes directly to alliance treasury
dApp Layer
└── GovernanceDashboard.tsx
├── Treasury balance and dividend history stats
├── One-click claim dividends
└── Proposal list + voting
🔧 Extension Exercises
- Prevent Double Voting: Each address can only vote once per proposal (maintain `voted_addresses: Table<address, bool>` on the proposal)
- Lockup Bonus: Addresses holding for over 30 days get 1.2x dividend weight (requires storing a holding timestamp)
- Multi-Asset Support: Treasury accepts both SUI and LUX, dividends also distributed proportionally in both tokens
- Auto-Execute Proposals: After proposal passes, contract automatically executes fee rate changes (requires Governor multisig)
📚 Related Documentation
- Chapter 14: On-Chain Economic System Design
- Chapter 12: Dynamic Fields & Events
- Sui Coin Standard
- Example 2: Gate Toll Station (revenue source)
Practical Case 12: Alliance Recruitment System (Application → Voting → Approval)
Objective: Build a complete alliance joining process: candidates submit applications → existing members vote → approval when threshold is reached and member NFT is issued; can also set founder veto power.
Status: Teaching example. The main text shows the minimum business loop for alliance recruitment; complete code is based on `book/src/code/example-12/`.
Corresponding Code Directory
Minimal Call Chain
User applies -> Members vote -> Vote count reaches threshold -> Issue MemberNFT or confiscate deposit
Requirements Analysis
Scenario: Alliance “Death Vanguard” has 20 members, each time accepting new members requires:
- Applicant deposits 10 SUI (prevents spam applications, refunded on approval)
- Existing members vote within 72 hours (anonymous, recorded on-chain)
- Automatically approve if supporting votes ≥ 60%, issue MemberNFT
- Founder has veto power (`veto`)
- When rejected, the deposit is confiscated and goes to the alliance treasury
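The 60% rule maps to integer bps math, computed the same way the contract will (no floating point). A sketch:

```typescript
// Approval check using basis points, mirroring APPROVAL_THRESHOLD_BPS = 6_000 (60%).
const APPROVAL_THRESHOLD_BPS = 6_000;

function isApproved(votesFor: number, votesAgainst: number): boolean {
  const total = votesFor + votesAgainst;
  if (total === 0) return false; // no votes can never approve
  return Math.floor((votesFor * 10_000) / total) >= APPROVAL_THRESHOLD_BPS;
}
// isApproved(12, 8) → true (exactly 60%); isApproved(11, 9) → false (55%)
```

Note the boundary: 60% exactly passes because the contract compares with `>=`; if you want a strict majority-plus rule, change the comparison rather than the threshold.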
Part One: Alliance Recruitment Contract
module alliance::recruitment;
use sui::table::{Self, Table};
use sui::object::{Self, UID, ID};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::transfer;
use sui::event;
use std::string::String;
// ── Constants ──────────────────────────────────────────────────
const VOTE_WINDOW_MS: u64 = 72 * 60 * 60 * 1000; // 72 hours
const APPROVAL_THRESHOLD_BPS: u64 = 6_000; // 60%
const APPLICATION_DEPOSIT: u64 = 10_000_000_000; // 10 SUI
// ── Data Structures ───────────────────────────────────────────────
public struct AllianceDAO has key {
id: UID,
name: String,
founder: address,
members: vector<address>,
treasury: Balance<SUI>,
pending_applications: Table<address, Application>,
total_accepted: u64,
}
public struct Application has store {
applicant: address,
applied_at_ms: u64,
votes_for: u64,
votes_against: u64,
voters: vector<address>, // Prevent duplicate voting
deposit: Balance<SUI>,
status: u8, // 0=pending, 1=approved, 2=rejected, 3=vetoed
}
/// Member NFT
public struct MemberNFT has key, store {
id: UID,
alliance_name: String,
member: address,
joined_at_ms: u64,
serial_number: u64,
}
public struct FounderCap has key, store { id: UID }
// ── Events ──────────────────────────────────────────────────
public struct ApplicationSubmitted has copy, drop { applicant: address, alliance_id: ID }
public struct VoteCast has copy, drop { applicant: address, voter: address, approve: bool }
public struct ApplicationResolved has copy, drop {
applicant: address,
approved: bool,
votes_for: u64,
votes_total: u64,
}
// ── Initialization ────────────────────────────────────────────────
public fun create_alliance(
name: vector<u8>,
ctx: &mut TxContext,
) {
let mut dao = AllianceDAO {
id: object::new(ctx),
name: std::string::utf8(name),
founder: ctx.sender(),
members: vector[ctx.sender()],
treasury: balance::zero(),
pending_applications: table::new(ctx),
total_accepted: 0,
};
// Founder gets MemberNFT (serial #1)
let founder_nft = MemberNFT {
id: object::new(ctx),
alliance_name: dao.name,
member: ctx.sender(),
joined_at_ms: 0,
serial_number: 1,
};
dao.total_accepted = 1;
let founder_cap = FounderCap { id: object::new(ctx) };
transfer::share_object(dao);
transfer::public_transfer(founder_nft, ctx.sender());
transfer::public_transfer(founder_cap, ctx.sender());
}
// ── Apply for Membership ──────────────────────────────────────────────
public fun apply(
dao: &mut AllianceDAO,
mut deposit: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let applicant = ctx.sender();
assert!(!vector::contains(&dao.members, &applicant), EAlreadyMember);
assert!(!table::contains(&dao.pending_applications, applicant), EAlreadyApplied);
assert!(coin::value(&deposit) >= APPLICATION_DEPOSIT, EInsufficientDeposit);
let deposit_balance = deposit.split(APPLICATION_DEPOSIT, ctx);
if (coin::value(&deposit) > 0) {
transfer::public_transfer(deposit, applicant);
} else { coin::destroy_zero(deposit); }
table::add(&mut dao.pending_applications, applicant, Application {
applicant,
applied_at_ms: clock.timestamp_ms(),
votes_for: 0,
votes_against: 0,
voters: vector::empty(),
deposit: coin::into_balance(deposit_balance),
status: 0,
});
event::emit(ApplicationSubmitted { applicant, alliance_id: object::id(dao) });
}
// ── Member Voting ──────────────────────────────────────────────
public fun vote(
dao: &mut AllianceDAO,
applicant: address,
approve: bool,
_member_nft: &MemberNFT, // Must hold NFT to vote
clock: &Clock,
ctx: &mut TxContext, // mutable: try_resolve may mint a MemberNFT
) {
assert!(vector::contains(&dao.members, &ctx.sender()), ENotMember);
assert!(table::contains(&dao.pending_applications, applicant), ENoApplication);
let app = table::borrow_mut(&mut dao.pending_applications, applicant);
assert!(app.status == 0, EApplicationClosed);
assert!(clock.timestamp_ms() <= app.applied_at_ms + VOTE_WINDOW_MS, EVoteWindowClosed);
assert!(!vector::contains(&app.voters, &ctx.sender()), EAlreadyVoted);
vector::push_back(&mut app.voters, ctx.sender());
if (approve) {
app.votes_for = app.votes_for + 1;
} else {
app.votes_against = app.votes_against + 1;
};
event::emit(VoteCast { applicant, voter: ctx.sender(), approve });
// Try auto-settlement if votes are sufficient
try_resolve(dao, applicant, clock, ctx);
}
fun try_resolve(
dao: &mut AllianceDAO,
applicant: address,
clock: &Clock,
ctx: &mut TxContext,
) {
let app = table::borrow(&dao.pending_applications, applicant);
let total_votes = app.votes_for + app.votes_against;
let member_count = vector::length(&dao.members);
// Early settlement: approvals reach 60% of all members with at least 3 votes cast,
// or rejections exceed 40% of all members once every member has voted
let enough_approval = app.votes_for * 10_000 / member_count >= APPROVAL_THRESHOLD_BPS
&& total_votes >= 3;
let definite_rejection = app.votes_against * 10_000 / member_count > 4_000
&& total_votes == member_count;
let time_expired = clock.timestamp_ms() > app.applied_at_ms + VOTE_WINDOW_MS;
if (enough_approval || time_expired || definite_rejection) {
resolve_application(dao, applicant, ctx);
}
}
fun resolve_application(
dao: &mut AllianceDAO,
applicant: address,
ctx: &mut TxContext,
) {
let app = table::borrow_mut(&mut dao.pending_applications, applicant);
let total_votes = app.votes_for + app.votes_against;
let approved = total_votes > 0
&& app.votes_for * 10_000 / (total_votes) >= APPROVAL_THRESHOLD_BPS;
if (approved) {
app.status = 1;
// Refund deposit
let deposit = balance::withdraw_all(&mut app.deposit);
transfer::public_transfer(coin::from_balance(deposit, ctx), applicant);
// Add to member list and issue NFT
vector::push_back(&mut dao.members, applicant);
dao.total_accepted = dao.total_accepted + 1;
let nft = MemberNFT {
id: object::new(ctx),
alliance_name: dao.name,
member: applicant,
joined_at_ms: 0, // simplified: the Clock isn't threaded into this helper; production code would record the real timestamp
serial_number: dao.total_accepted,
};
transfer::public_transfer(nft, applicant);
} else {
app.status = 2;
// Confiscate deposit to treasury
let deposit = balance::withdraw_all(&mut app.deposit);
balance::join(&mut dao.treasury, deposit);
};
event::emit(ApplicationResolved {
applicant,
approved,
votes_for: app.votes_for,
votes_total: total_votes,
});
}
/// Founder veto
public fun veto(
dao: &mut AllianceDAO,
applicant: address,
_cap: &FounderCap,
_ctx: &mut TxContext,
) {
assert!(table::contains(&dao.pending_applications, applicant), ENoApplication);
let app = table::borrow_mut(&mut dao.pending_applications, applicant);
assert!(app.status == 0, EApplicationClosed);
app.status = 3;
// Confiscate deposit
let deposit = balance::withdraw_all(&mut app.deposit);
balance::join(&mut dao.treasury, deposit);
}
// ── Error Codes ────────────────────────────────────────────────
const EAlreadyMember: u64 = 0;
const EAlreadyApplied: u64 = 1;
const EInsufficientDeposit: u64 = 2;
const ENotMember: u64 = 3;
const ENoApplication: u64 = 4;
const EApplicationClosed: u64 = 5;
const EVoteWindowClosed: u64 = 6;
const EAlreadyVoted: u64 = 7;
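Before moving to the dApp, the early-settlement rules in try_resolve are worth internalizing. A browser-free TypeScript mirror the frontend could use to hint "one more vote settles this" (illustrative sketch; the on-chain logic is authoritative):

```typescript
// Mirror of the contract's `try_resolve` early-settlement conditions.
const VOTE_WINDOW_MS = 72 * 60 * 60 * 1000
const THRESHOLD_BPS = 6_000

interface VoteState {
  votesFor: number
  votesAgainst: number
  memberCount: number
  appliedAtMs: number
}

function willSettle(s: VoteState, nowMs: number): boolean {
  const total = s.votesFor + s.votesAgainst
  // Approvals measured against ALL members, not just votes cast
  const enoughApproval =
    Math.floor((s.votesFor * 10_000) / s.memberCount) >= THRESHOLD_BPS && total >= 3
  // Definite rejection only once every member has voted
  const definiteRejection =
    Math.floor((s.votesAgainst * 10_000) / s.memberCount) > 4_000 &&
    total === s.memberCount
  const timeExpired = nowMs > s.appliedAtMs + VOTE_WINDOW_MS
  return enoughApproval || definiteRejection || timeExpired
}

// 20 members, 12 approvals already cast: 12/20 = 60% of members → settles early
console.log(willSettle({ votesFor: 12, votesAgainst: 0, memberCount: 20, appliedAtMs: 0 }, 1000))
```

Note the asymmetry: early approval compares votes_for against the full member count, while the final ratio inside resolve_application compares against votes actually cast.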
Part Two: Recruitment Management dApp
// src/RecruitmentPanel.tsx
import { useState } from 'react'
import { useCurrentClient, useCurrentAccount } from '@mysten/dapp-kit-react'
import { useQuery } from '@tanstack/react-query'
import { Transaction } from '@mysten/sui/transactions'
import { useDAppKit } from '@mysten/dapp-kit-react'
const RECRUIT_PKG = "0x_RECRUIT_PACKAGE_"
const DAO_ID = "0x_DAO_ID_"
interface PendingApp {
applicant: string
applied_at_ms: string
votes_for: string
votes_against: string
status: string
}
export function RecruitmentPanel({ isMember, isFounder }: {
isMember: boolean, isFounder: boolean
}) {
const client = useCurrentClient()
const dAppKit = useDAppKit()
const account = useCurrentAccount()
const [status, setStatus] = useState('')
const { data: dao, refetch } = useQuery({
queryKey: ['dao', DAO_ID],
queryFn: async () => {
const obj = await client.getObject({ id: DAO_ID, options: { showContent: true } })
return (obj.data?.content as any)?.fields
},
refetchInterval: 15_000,
})
const handleApply = async () => {
const tx = new Transaction()
const [deposit] = tx.splitCoins(tx.gas, [tx.pure.u64(10_000_000_000)])
tx.moveCall({
target: `${RECRUIT_PKG}::recruitment::apply`,
arguments: [tx.object(DAO_ID), deposit, tx.object('0x6')],
})
try {
setStatus('Submitting application...')
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus('Application submitted! Waiting for member votes (within 72 hours)')
refetch()
} catch (e: any) { setStatus(`${e.message}`) }
}
const handleVote = async (applicant: string, approve: boolean) => {
const tx = new Transaction()
tx.moveCall({
target: `${RECRUIT_PKG}::recruitment::vote`,
arguments: [
tx.object(DAO_ID),
tx.pure.address(applicant),
tx.pure.bool(approve),
tx.object('MEMBER_NFT_ID'), // placeholder: the voter's own MemberNFT object ID
tx.object('0x6'),
],
})
try {
setStatus('Submitting vote...')
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus(`Voted: ${approve ? 'Approve' : 'Reject'}`)
refetch()
} catch (e: any) { setStatus(`${e.message}`) }
}
const pendingApps = dao?.pending_applications?.fields?.contents ?? []
const memberCount = dao?.members?.length ?? 0
return (
<div className="recruitment-panel">
<header>
<h1>{dao?.name ?? '...'} — Recruitment Center</h1>
<div className="stats">
<span>Members: {memberCount}</span>
<span>Pending Applications: {pendingApps.filter((a: any) => a.fields?.value?.fields?.status === '0').length}</span>
</div>
</header>
{/* Apply for Membership */}
{!isMember && (
<section className="apply-section">
<h3>Apply to Join Alliance</h3>
<p>Requires 10 SUI deposit (refunded on approval). Existing members will vote within 72 hours.</p>
<button className="apply-btn" onClick={handleApply}>
Submit Application (10 SUI deposit)
</button>
</section>
)}
{/* Pending Applications (Members Only) */}
{isMember && (
<section className="pending-section">
<h3>Pending Applications</h3>
{pendingApps.map((entry: any) => {
const app = entry.fields?.value?.fields
if (!app || app.status !== '0') return null
const hoursLeft = Math.max(0,
Math.ceil((Number(app.applied_at_ms) + 72*3600_000 - Date.now()) / 3_600_000)
)
const totalVotes = Number(app.votes_for) + Number(app.votes_against)
const pct = memberCount > 0 ? Math.round(Number(app.votes_for) * 100 / memberCount) : 0
return (
<div key={entry.fields?.key} className="application-card">
<div className="applicant-info">
<strong>{entry.fields?.key?.slice(0, 8)}...</strong>
<span className="time-left">{hoursLeft}h remaining</span>
</div>
<div className="vote-bar">
<div className="vote-fill" style={{ width: `${pct}%` }} />
<span>{app.votes_for} Approve / {app.votes_against} Reject ({totalVotes}/{memberCount} voted)</span>
</div>
<div className="vote-buttons">
<button className="btn-approve" onClick={() => handleVote(entry.fields?.key, true)}>
Approve
</button>
<button className="btn-reject" onClick={() => handleVote(entry.fields?.key, false)}>
Reject
</button>
{isFounder && (
<button className="btn-veto" onClick={() => {/* TODO: build a tx calling recruitment::veto with the FounderCap */}}>
Veto
</button>
)}
</div>
</div>
)
})}
</section>
)}
{status && <p className="status">{status}</p>}
</div>
)
}
Key Design Highlights
| Mechanism | Implementation |
|---|---|
| Prevent spam applications | 10 SUI deposit, confiscated on rejection |
| Prevent duplicate voting | voters vector tracks members who voted |
| Auto settlement | Check if threshold is reached after each vote |
| Veto power | FounderCap authorized veto() |
| Member credential | MemberNFT as voting and permission carrier |
Related Documentation
Practical Case 15: Decentralized Item Insurance
Objective: Build an on-chain item insurance protocol—players purchase PvP battle damage insurance, if items are destroyed in-game they receive automatic compensation through server proof (AdminACL), claims paid from insurance pool.
Status: Teaching example. The main text emphasizes the claims process and fund-pool design; the complete directory is based on book/src/code/example-15/.
Corresponding Code Directory
Minimal Call Chain
User purchases policy -> Server issues battle damage proof -> Contract verifies policy and signature -> Insurance pool pays out
Test Loop
- Successful purchase: confirm the 70/30 split between `claims_pool` and `reserve` is correct
- Valid claim within the coverage period: confirm the payout equals `coverage_amount`
- Expired claim rejection: confirm expired policies cannot file claims
- Insufficient claims pool: confirm no negative balance or duplicate deductions
Requirements Analysis
Scenario: A player brings a rare shield worth 500 SUI into PvP combat and pays 15 SUI for 30-day item insurance. If the shield is destroyed in battle:
- Game server records death event
- Player submits claim application + server signature (AdminACL verification)
- Contract verifies policy is within validity period, automatically pays out (80% payout rate)
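The premium, payout, and pool-split arithmetic can be previewed off-chain before touching the contract. A TypeScript mirror for the purchase UI (illustrative sketch; the on-chain calculation in purchase_policy is authoritative):

```typescript
// Premium and pool-split math mirrored from the insurance contract.
const MIN_PREMIUM_BPS = 300   // 3% of coverage per 30 days
const COVERAGE_BPS = 8_000    // 80% payout rate

function quote(insuredValueMist: number, days: number) {
  const monthlyPremium = Math.floor((insuredValueMist * MIN_PREMIUM_BPS) / 10_000)
  const premium = Math.floor((monthlyPremium * days) / 30)
  const claimsShare = Math.floor((premium * 70) / 100) // 70% → claims pool
  const reserveShare = premium - claimsShare           // 30% → reserve
  const coverage = Math.floor((insuredValueMist * COVERAGE_BPS) / 10_000)
  return { premium, claimsShare, reserveShare, coverage }
}

// 500 SUI shield insured for 30 days (amounts in MIST, 1 SUI = 1e9 MIST)
const q = quote(500e9, 30)
console.log(q.premium / 1e9)  // 15 SUI premium
console.log(q.coverage / 1e9) // 400 SUI maximum payout
```

Computing reserveShare as `premium - claimsShare` (rather than 30% directly) guarantees the two shares always sum to the premium even when division truncates.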
Contract
module insurance::pvp_shield;
use sui::object::{Self, UID, ID};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::table::{Self, Table};
use sui::transfer;
use sui::event;
// Assumed import: AdminACL and verify_sponsor come from the world package's access module (see Chapter 26)
use world::access::{AdminACL, verify_sponsor};
// ── Constants ──────────────────────────────────────────────────
const COVERAGE_BPS: u64 = 8_000; // 80% payout rate
const DAY_MS: u64 = 86_400_000;
const MIN_PREMIUM_BPS: u64 = 300; // Minimum premium: 3% of coverage/month
// ── Data Structures ───────────────────────────────────────────────
/// Insurance pool (shared)
public struct InsurancePool has key {
id: UID,
reserve: Balance<SUI>, // Reserve fund
total_collected: u64, // Total premiums collected
total_paid_out: u64, // Total payouts
claims_pool: Balance<SUI>, // Dedicated claims pool (70% of premiums)
admin: address,
}
/// Policy NFT
public struct PolicyNFT has key, store {
id: UID,
insured_item_id: ID, // Insured item ObjectID
insured_value: u64, // Coverage amount (SUI)
coverage_amount: u64, // Maximum payout (= insured_value × 80%)
valid_until_ms: u64, // Expiration date
is_claimed: bool,
policy_holder: address,
}
// ── Events ──────────────────────────────────────────────────
public struct PolicyIssued has copy, drop {
policy_id: ID,
holder: address,
insured_item_id: ID,
coverage: u64,
expires_ms: u64,
}
public struct ClaimPaid has copy, drop {
policy_id: ID,
holder: address,
amount_paid: u64,
}
// ── Initialization ────────────────────────────────────────────────
fun init(ctx: &mut TxContext) {
transfer::share_object(InsurancePool {
id: object::new(ctx),
reserve: balance::zero(),
total_collected: 0,
total_paid_out: 0,
claims_pool: balance::zero(),
admin: ctx.sender(),
});
}
// ── Purchase Insurance ──────────────────────────────────────────────
public fun purchase_policy(
pool: &mut InsurancePool,
insured_item_id: ID, // Insured item's ObjectID
insured_value: u64, // Declared coverage amount
days: u64, // Insurance days (1-90)
mut premium: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(days >= 1 && days <= 90, EInvalidDuration);
// Calculate premium: insured_value × monthly rate × days
let monthly_premium = insured_value * MIN_PREMIUM_BPS / 10_000;
let required_premium = monthly_premium * days / 30;
assert!(coin::value(&premium) >= required_premium, EInsufficientPremium);
let pay = premium.split(required_premium, ctx);
let premium_amount = coin::value(&pay);
// 70% to claims pool, 30% to reserve
let claims_share = premium_amount * 70 / 100;
let reserve_share = premium_amount - claims_share;
let mut pay_balance = coin::into_balance(pay);
let claims_portion = balance::split(&mut pay_balance, claims_share);
balance::join(&mut pool.claims_pool, claims_portion);
balance::join(&mut pool.reserve, pay_balance);
pool.total_collected = pool.total_collected + premium_amount;
if (coin::value(&premium) > 0) {
transfer::public_transfer(premium, ctx.sender());
} else { coin::destroy_zero(premium); }
let coverage = insured_value * COVERAGE_BPS / 10_000;
let valid_until_ms = clock.timestamp_ms() + days * DAY_MS;
let policy = PolicyNFT {
id: object::new(ctx),
insured_item_id,
insured_value,
coverage_amount: coverage,
valid_until_ms,
is_claimed: false,
policy_holder: ctx.sender(),
};
let policy_id = object::id(&policy);
transfer::public_transfer(policy, ctx.sender());
event::emit(PolicyIssued {
policy_id,
holder: ctx.sender(),
insured_item_id,
coverage,
expires_ms: valid_until_ms,
});
}
// ── File Claim (requires game server signature proving item destruction) ────────────
public fun file_claim(
pool: &mut InsurancePool,
policy: &mut PolicyNFT,
admin_acl: &AdminACL, // Game server verifies item is actually destroyed
clock: &Clock,
ctx: &mut TxContext,
) {
// Verify server signature (i.e., server confirms item has been destroyed)
verify_sponsor(admin_acl, ctx);
assert!(!policy.is_claimed, EAlreadyClaimed);
assert!(clock.timestamp_ms() <= policy.valid_until_ms, EPolicyExpired);
assert!(policy.policy_holder == ctx.sender(), ENotPolicyHolder);
// Check if claims pool has sufficient balance
let payout = policy.coverage_amount;
assert!(balance::value(&pool.claims_pool) >= payout, EInsufficientClaimsPool);
// Mark as claimed (prevent duplicate claims)
policy.is_claimed = true;
// Payout
let payout_coin = coin::take(&mut pool.claims_pool, payout, ctx);
pool.total_paid_out = pool.total_paid_out + payout;
transfer::public_transfer(payout_coin, ctx.sender());
event::emit(ClaimPaid {
policy_id: object::id(policy),
holder: ctx.sender(),
amount_paid: payout,
});
}
/// Admin replenishes claims pool from reserve (when claims pool is insufficient)
public fun replenish_claims_pool(
pool: &mut InsurancePool,
amount: u64,
ctx: &TxContext,
) {
assert!(ctx.sender() == pool.admin, ENotAdmin);
assert!(balance::value(&pool.reserve) >= amount, EInsufficientReserve);
let replenish = balance::split(&mut pool.reserve, amount);
balance::join(&mut pool.claims_pool, replenish);
}
const EInvalidDuration: u64 = 0;
const EInsufficientPremium: u64 = 1;
const EAlreadyClaimed: u64 = 2;
const EPolicyExpired: u64 = 3;
const ENotPolicyHolder: u64 = 4;
const EInsufficientClaimsPool: u64 = 5;
const ENotAdmin: u64 = 6;
const EInsufficientReserve: u64 = 7;
dApp (Purchase and Claims)
// InsuranceApp.tsx
import { useState } from 'react'
import { Transaction } from '@mysten/sui/transactions'
import { useDAppKit } from '@mysten/dapp-kit-react'
const INS_PKG = "0x_INSURANCE_PACKAGE_"
const POOL_ID = "0x_POOL_ID_"
export function InsuranceApp() {
const dAppKit = useDAppKit()
const [value, setValue] = useState(500) // Coverage amount (SUI)
const [days, setDays] = useState(30)
const [status, setStatus] = useState('')
// Premium calculation
const premium = (value * 0.03 * days / 30).toFixed(2)
const coverage = (value * 0.8).toFixed(2)
const purchase = async () => {
const tx = new Transaction()
const premiumMist = BigInt(Math.ceil(Number(premium) * 1e9))
const [payment] = tx.splitCoins(tx.gas, [tx.pure.u64(premiumMist)])
tx.moveCall({
target: `${INS_PKG}::pvp_shield::purchase_policy`,
arguments: [
tx.object(POOL_ID),
tx.pure.id('0x_ITEM_OBJECT_ID_'),
tx.pure.u64(value * 1e9),
tx.pure.u64(days),
payment,
tx.object('0x6'),
],
})
try {
setStatus('Purchasing insurance...')
await dAppKit.signAndExecuteTransaction({ transaction: tx })
setStatus('Policy activated! PolicyNFT sent to wallet')
} catch (e: any) { setStatus(`${e.message}`) }
}
return (
<div className="insurance-app">
<h1>PvP Item Battle Damage Insurance</h1>
<div className="config-section">
<label>Coverage Amount (SUI)</label>
<input type="range" min={100} max={5000} step={50}
value={value} onChange={e => setValue(Number(e.target.value))} />
<span>{value} SUI</span>
<label>Insurance Days</label>
{[7, 14, 30, 60, 90].map(d => (
<button key={d} className={days === d ? 'selected' : ''} onClick={() => setDays(d)}>
{d} days
</button>
))}
</div>
<div className="summary-card">
<div className="summary-row">
<span>Coverage</span><strong>{value} SUI</strong>
</div>
<div className="summary-row">
<span>Maximum Payout</span><strong>{coverage} SUI</strong>
</div>
<div className="summary-row">
<span>Premium</span><strong>{premium} SUI</strong>
</div>
<div className="summary-row">
<span>Valid Period</span><strong>{days} days</strong>
</div>
</div>
<button className="purchase-btn" onClick={purchase}>
Purchase Insurance ({premium} SUI)
</button>
{status && <p className="status">{status}</p>}
</div>
)
}
Related Documentation
- Chapter 8: Sponsored Transactions and AdminACL Verification
- Chapter 14: Economic System and Fund Pools
Practical Case 17: In-Game Overlay dApp Practice (Toll Station In-Game Version)
Objective: Transform Example 2’s stargate toll-station dApp into an in-game overlay: a ticket-purchase panel pops up automatically when the player approaches a stargate, and signing and jumping complete without leaving the game.
Status: Teaching example. This case focuses on the dApp overlay transformation; the contract layer reuses Example 2.
Corresponding Code Directory
Minimal Call Chain
In-game event -> postMessage -> Overlay dApp updates state -> User signs -> Purchase/jump success -> Overlay closes
Requirements Analysis
Scenario: Toll station logic already exists (reuse Example 2 contract), now need to:
- Game client detects player entering 100km range of stargate
- Send an event to the WebView overlay via `postMessage`
- Overlay pops up a ticket-purchase panel showing the fee and destination
- Player clicks once, EVE Vault pops up signature confirmation
- After signing completes, show success animation and auto-close
This case is the hands-on engineering practice for Chapter 20, with more complete code details.
Project Structure
ingame-toll-overlay/
├── index.html
├── src/
│ ├── main.tsx # Entry, Provider setup
│ ├── App.tsx # Environment detection and routing
│ ├── overlay/
│ │ ├── TollOverlay.tsx # In-game overlay main component
│ │ ├── JumpPanel.tsx # Ticket purchase panel
│ │ └── SuccessAnimation.tsx # Success animation
│ └── lib/
│ ├── gameEvents.ts # postMessage listener
│ ├── environment.ts # Environment detection
│ └── contracts.ts # Contract constants
├── ingame.css # Overlay styles
└── vite.config.ts
Part One: Game Event Listener
// src/lib/gameEvents.ts
export interface GateAproachEvent {
type: "GATE_IN_RANGE"
gateId: string
gateName: string
destinationSystemName: string
distanceKm: number
}
export interface PlayerLeftEvent {
type: "GATE_OUT_OF_RANGE"
gateId: string
}
export type OverlayEvent = GateAproachEvent | PlayerLeftEvent
type Listener = (event: OverlayEvent) => void
const listeners = new Set<Listener>()
let initialized = false
export function initGameEventListener() {
if (initialized) return
initialized = true
window.addEventListener("message", (e: MessageEvent) => {
if (e.data?.source !== "EVEFrontierClient") return
const event = e.data as { source: string } & OverlayEvent
if (!event.type) return
listeners.forEach(fn => fn(event))
})
}
export function addGameEventListener(fn: Listener): () => void {
listeners.add(fn)
return () => listeners.delete(fn)
}
// ── Development/testing: simulate game events ─────────────────────────────
export function simulateGateApproach(gateId: string) {
const mockEvent: GateAproachEvent = {
type: "GATE_IN_RANGE",
gateId,
gateName: "Alpha Gate Alpha-7",
destinationSystemName: "Trade Hub IV",
distanceKm: 78,
}
window.dispatchEvent(
new MessageEvent("message", {
data: { source: "EVEFrontierClient", ...mockEvent },
})
)
}
Part Two: Main Overlay Component
// src/overlay/TollOverlay.tsx
import { useEffect, useState, useCallback } from 'react'
import {
initGameEventListener,
addGameEventListener,
GateAproachEvent,
} from '../lib/gameEvents'
import { JumpPanel } from './JumpPanel'
import { SuccessAnimation } from './SuccessAnimation'
type OverlayState = 'hidden' | 'visible' | 'success'
export function TollOverlay() {
const [state, setState] = useState<OverlayState>('hidden')
const [activeGate, setActiveGate] = useState<GateAproachEvent | null>(null)
useEffect(() => {
initGameEventListener()
return addGameEventListener((event) => {
if (event.type === 'GATE_IN_RANGE') {
setActiveGate(event)
setState('visible')
} else if (event.type === 'GATE_OUT_OF_RANGE') {
if (state !== 'success') setState('hidden')
}
})
}, [state])
const handleSuccess = useCallback(() => {
setState('success')
// Auto-close after 3 seconds
setTimeout(() => {
setState('hidden')
setActiveGate(null)
}, 3000)
}, [])
const handleDismiss = useCallback(() => {
setState('hidden')
}, [])
if (state === 'hidden') return null
return (
<div className="overlay-container">
<div className={`overlay-panel ${state === 'success' ? 'overlay-panel--success' : ''}`}>
{state === 'success' ? (
<SuccessAnimation />
) : (
activeGate && (
<JumpPanel
gateEvent={activeGate}
onSuccess={handleSuccess}
onDismiss={handleDismiss}
/>
)
)}
</div>
</div>
)
}
Part Three: Ticket Purchase Panel
// src/overlay/JumpPanel.tsx
import { useState } from 'react'
import { useQuery } from '@tanstack/react-query'
import { useCurrentClient } from '@mysten/dapp-kit-react'
import { useDAppKit } from '@mysten/dapp-kit-react'
import { Transaction } from '@mysten/sui/transactions'
import { GateAproachEvent } from '../lib/gameEvents'
import { TOLL_PKG, ADMIN_ACL_ID, CHARACTER_ID } from '../lib/contracts'
interface JumpPanelProps {
gateEvent: GateAproachEvent
onSuccess: () => void
onDismiss: () => void
}
export function JumpPanel({ gateEvent, onSuccess, onDismiss }: JumpPanelProps) {
const client = useCurrentClient()
const dAppKit = useDAppKit()
const [buying, setBuying] = useState(false)
// Read toll for this stargate
const { data: tollInfo } = useQuery({
queryKey: ['gate-toll', gateEvent.gateId],
queryFn: async () => {
const obj = await client.getObject({
id: gateEvent.gateId,
options: { showContent: true },
})
const fields = (obj.data?.content as any)?.fields
return {
tollAmount: Number(fields?.toll_amount ?? 0),
destinationGateId: fields?.linked_gate_id,
}
},
})
const tollSUI = ((tollInfo?.tollAmount ?? 0) / 1e9).toFixed(2)
const handleBuy = async () => {
if (!tollInfo) return
setBuying(true)
const tx = new Transaction()
const [payment] = tx.splitCoins(tx.gas, [tx.pure.u64(tollInfo.tollAmount)])
tx.moveCall({
target: `${TOLL_PKG}::toll_gate_ext::pay_toll_and_get_permit`,
arguments: [
tx.object(gateEvent.gateId), // Source stargate
tx.object(tollInfo.destinationGateId), // Destination stargate
tx.object(CHARACTER_ID), // Character object
payment,
tx.object(ADMIN_ACL_ID),
tx.object('0x6'), // Clock
],
})
try {
// Call sponsored transaction (server verifies proximity then sponsors gas)
await dAppKit.signAndExecuteSponsoredTransaction({ transaction: tx })
onSuccess()
} catch (e: any) {
console.error(e)
setBuying(false)
}
}
return (
<div className="jump-panel">
{/* Close button */}
<button className="dismiss-btn" onClick={onDismiss} aria-label="Close">✕</button>
{/* Stargate info */}
<div className="gate-icon">🌀</div>
<h2 className="gate-name">{gateEvent.gateName}</h2>
<p className="destination">
Destination: <strong>{gateEvent.destinationSystemName}</strong>
</p>
<p className="distance">Distance: {gateEvent.distanceKm} km</p>
{/* Fee */}
<div className="toll-display">
<span className="toll-label">Toll Fee</span>
<span className="toll-amount">{tollSUI} SUI</span>
</div>
{/* Purchase button */}
<button
className="jump-btn"
onClick={handleBuy}
disabled={buying || !tollInfo}
>
{buying ? 'Signing...' : 'Purchase Ticket & Jump'}
</button>
<p className="jump-hint">Permit valid for 30 minutes</p>
</div>
)
}
Part Four: Success Animation
// src/overlay/SuccessAnimation.tsx
import { useEffect, useState } from 'react'
export function SuccessAnimation() {
const [frame, setFrame] = useState(0)
const frames = ['🌌', '⚡', '🌀', '✨', '🚀']
useEffect(() => {
const timer = setInterval(() => {
setFrame(f => (f + 1) % frames.length)
}, 200)
return () => clearInterval(timer)
}, [])
return (
<div className="success-animation">
<div className="animation-icon">{frames[frame]}</div>
<h2>Jump Successful!</h2>
<p>Warping to destination...</p>
</div>
)
}
In-Game Specific CSS
/* ingame.css */
.overlay-container {
position: fixed;
right: 16px;
top: 50%;
transform: translateY(-50%);
z-index: 9999;
width: 320px;
}
.overlay-panel {
background: rgba(8, 12, 24, 0.95);
border: 1px solid rgba(96, 180, 255, 0.5);
border-radius: 12px;
padding: 20px;
color: #d0e8ff;
font-family: 'Share Tech Mono', monospace;
backdrop-filter: blur(12px);
animation: slideIn 0.25s ease;
box-shadow: 0 0 30px rgba(96, 180, 255, 0.15);
}
@keyframes slideIn {
from { opacity: 0; transform: translateX(30px); }
to { opacity: 1; transform: translateX(0); }
}
.jump-btn {
width: 100%;
padding: 14px;
background: linear-gradient(135deg, #1a5cff, #0a3acc);
border: none;
border-radius: 8px;
color: white;
font-size: 15px;
font-family: inherit;
letter-spacing: 0.05em;
text-transform: uppercase;
cursor: pointer;
transition: all 0.2s;
}
.jump-btn:hover:not(:disabled) {
background: linear-gradient(135deg, #2a6cff, #1a4aee);
box-shadow: 0 0 20px rgba(26, 92, 255, 0.4);
}
.toll-display {
display: flex;
justify-content: space-between;
align-items: center;
background: rgba(255,255,255,0.05);
border-radius: 8px;
padding: 12px 16px;
margin: 16px 0;
}
.toll-amount {
font-size: 24px;
font-weight: bold;
color: #4fa3ff;
}
.success-animation {
text-align: center;
padding: 24px 0;
}
.success-animation .animation-icon { font-size: 48px; }
Related Documentation
- Chapter 20: In-Game dApp Integration
- Chapter 8: Sponsored Transactions
- Example 2: Stargate Toll Station (Contract Layer)
Chapter 26: Complete Analysis of Access Control System
Learning Objective: Deeply understand the complete permission architecture of the `world::access` module—from `GovernorCap`, `AdminACL`, and `OwnerCap` to the `Receiving` pattern—and master the precise design of EVE Frontier’s access control system.
Status: Teaching example. The access-control details are numerous; read them section by section against the source code and tests rather than relying on concept diagrams alone.
Minimal Call Chain
Entry point -> Permission object/authorization list check -> Borrow or consume capability -> Execute business action -> Return or destroy capability object
Corresponding Code Directories
Key Structs
| Type | Purpose | Reading Focus |
|---|---|---|
| `AdminACL` | Server authorization whitelist | How the sponsor whitelist is maintained |
| `GovernorCap` | System-level highest-permission capability | Which actions must go through the governor, not the owner |
| `OwnerCap<T>` | Generic ownership credential | The three lifecycle steps: borrow, return, transfer |
| `Receiving`-related patterns | Safe borrowing of object-owned assets | The difference between object-owned and address-owned |
| `ServerAddressRegistry` | Server address registry | How signature identity connects to business permissions |
Key Entry Functions
| Entry | Purpose | What You Should Verify |
|---|---|---|
| `verify_sponsor` | Check whether the submitter is in the server whitelist | It solves identity sourcing, not all business constraints |
| `borrow_owner_cap` / `return_owner_cap` | Borrow and return ownership credentials | Whether the Borrow-Use-Return pattern is strictly followed |
| governor / registry management entries | Maintain system-level permission configuration | Whether system-admin rights are incorrectly delegated to regular owners |
Most Easily Misread Points
- `ctx.sender()` alone usually isn’t enough in EVE Frontier; many scenarios must also check a capability or a sponsor
- `OwnerCap<T>` isn’t a one-time consumable; it is usually borrowed temporarily and then returned
- Object-owned assets cannot reuse the permission checks written for address-owned assets
The most effective way to understand this chapter is to break permissions into three sources: address identity, capability objects, and server endorsement. Address identity answers “who sent this transaction”; a capability object answers “what control does the holder have over which specific object”; server endorsement answers “is this a system action recognized by the game world”. EVE Frontier uses all three simultaneously because `ctx.sender()` alone cannot express complex scenarios like item trusteeship, building control, and off-chain state injection.
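The three sources can be pictured as a checklist that an entry function walks through. A conceptual TypeScript sketch (all names are illustrative; the real checks are Move `assert!`s spread across the entry functions):

```typescript
// Conceptual model of EVE Frontier's three permission sources.
interface TxContext { sender: string }

interface AdminACL { authorizedSponsors: Set<string> } // server endorsement
interface OwnerCap { authorizedObjectId: string }      // capability object

function canOperate(
  ctx: TxContext,
  targetObjectId: string,
  opts: { acl?: AdminACL; cap?: OwnerCap; expectedSender?: string },
): boolean {
  // 1. Address identity: who sent this transaction
  if (opts.expectedSender && ctx.sender !== opts.expectedSender) return false
  // 2. Capability object: does the cap govern this specific object
  if (opts.cap && opts.cap.authorizedObjectId !== targetObjectId) return false
  // 3. Server endorsement: is the sender a whitelisted sponsor
  if (opts.acl && !opts.acl.authorizedSponsors.has(ctx.sender)) return false
  return true
}

const acl = { authorizedSponsors: new Set(['0xserver']) }
console.log(canOperate({ sender: '0xserver' }, '0xgate', { acl })) // true
console.log(canOperate({ sender: '0xplayer' }, '0xgate', { acl })) // false
```

Each entry function in the World contract composes only the checks it needs: `verify_sponsor` is check 3, an `OwnerCap<T>` parameter is check 2, and a plain sender comparison is check 1.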
1. Why Is Access Control System Complex?
Traditional smart contract permissions usually have only two layers: owner and public. EVE Frontier needs more precise control:
Game Company (CCP Level) → GovernorCap: system-level configuration
├── Game Server → AdminACL/verify_sponsor: on-chain operation authorization
├── Building Owner (Builder) → OwnerCap<T>: building control
└── Player (Character) → Access own items through OwnerCap
One character’s items in another player’s building—who can operate this item? This is the core problem EVE Frontier access control needs to solve.
So EVE permissions don’t revolve around “is a certain address the owner”, but around “who currently holds a given object, who can temporarily borrow it, and who can write world state on the server’s behalf”. Once the object world becomes complex, the single owner field common in traditional contracts is no longer fine-grained enough.
2. AdminACL: Server Authorization Whitelist
// world/sources/access/access_control.move
public struct AdminACL has key {
id: UID,
authorized_sponsors: Table<address, bool>, // Server address whitelist
}
/// Only allow registered servers to execute privileged operations
public fun verify_sponsor(admin_acl: &AdminACL, ctx: &TxContext) {
assert!(
admin_acl.authorized_sponsors.contains(ctx.sender()),
EUnauthorizedSponsor,
);
}
Usage: All operations in World contract requiring game server permissions start with admin_acl.verify_sponsor(ctx):
// Create character (must be triggered by server)
public fun create_character(..., admin_acl: &AdminACL, ...) {
admin_acl.verify_sponsor(ctx);
// ...
}
// Create KillMail (must be triggered by server)
public fun create_killmail(..., admin_acl: &AdminACL, ...) {
admin_acl.verify_sponsor(ctx);
// ...
}
Server Address Registration (Only GovernorCap Can Operate)
public fun add_sponsor_to_acl(
admin_acl: &mut AdminACL,
_: &GovernorCap, // Requires highest permission
sponsor: address,
) {
admin_acl.authorized_sponsors.add(sponsor, true);
}
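To build intuition before reading further, the whitelist pattern above can be modeled off-chain. This is a hypothetical TypeScript sketch (class and method names are illustrative, not the on-chain API); a thrown error plays the role of Move's abort:

```typescript
// Hypothetical off-chain model of the AdminACL whitelist pattern,
// for intuition only -- not the on-chain Move implementation.
class AdminACL {
  private authorizedSponsors = new Set<string>();

  // In the real system this mutation requires the GovernorCap;
  // here we only model the table update itself.
  addSponsor(sponsor: string): void {
    this.authorizedSponsors.add(sponsor);
  }

  // Mirrors verify_sponsor: abort (throw) when the sender is not whitelisted.
  verifySponsor(sender: string): void {
    if (!this.authorizedSponsors.has(sender)) {
      throw new Error("EUnauthorizedSponsor");
    }
  }
}
```

The point of the model: the check is a set membership test on the transaction sender, and every privileged entry function runs it first.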
3. GovernorCap: System Highest Permission
// GovernorCap is the system's "root key"
// Its existence means game company retains system-level configuration capability
public struct GovernorCap has key, store { id: UID }
GovernorCap is used for:
- Adding/removing server addresses in AdminACL
- Registering servers to ServerAddressRegistry (for signature verification)
- Setting system-wide configuration parameters
public fun register_server_address(
server_address_registry: &mut ServerAddressRegistry,
_: &GovernorCap,
server_address: address,
) {
server_address_registry.authorized_address.add(server_address, true);
}
4. OwnerCap<T>: Generic Ownership Credential
This is the most ingenious design in EVE Frontier’s access control:
/// OwnerCap<T> proves holder's control over some T type object
public struct OwnerCap<phantom T: key> has key, store {
id: UID,
authorized_object_id: ID, // Bound to specific object ID
}
Why use generics?
OwnerCap<Gate> // Control over some Gate
OwnerCap<Turret> // Control over some Turret
OwnerCap<StorageUnit> // Control over some StorageUnit
OwnerCap<Character> // Control over some Character
The type system naturally ensures a capability cannot be misused across types.
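The double binding of a capability, to a type parameter and to a concrete object id, can be sketched in TypeScript. Everything below (the interfaces, the `_phantom` marker, `assertCapMatches`) is a hypothetical model for intuition, not SDK code:

```typescript
// Hypothetical model of OwnerCap<T>: the type parameter prevents cross-type
// use at compile time, and the bound object id prevents cross-object use at
// run time. Names are illustrative, not the on-chain API.
interface GameObject { readonly id: string; readonly kind: string }
interface Gate extends GameObject { readonly kind: "gate" }
interface Turret extends GameObject { readonly kind: "turret" }

interface OwnerCap<T extends GameObject> {
  readonly authorizedObjectId: string;
  // Phantom marker: carries T at the type level only, never set at runtime.
  readonly _phantom?: T;
}

// Mirrors the assert in set_gate_config: the cap must be bound to this object.
function assertCapMatches<T extends GameObject>(cap: OwnerCap<T>, obj: T): void {
  if (cap.authorizedObjectId !== obj.id) throw new Error("EOwnerCapMismatch");
}
```

In this model, passing an `OwnerCap<Gate>` where an `OwnerCap<Turret>` is expected is a type error, just as the phantom type parameter makes it a type error in Move.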
OwnerCap Creation (Only AdminACL Can Create)
public fun create_owner_cap<T: key>(
admin_acl: &AdminACL,
obj: &T,
ctx: &mut TxContext,
): OwnerCap<T> {
admin_acl.verify_sponsor(ctx);
let object_id = object::id(obj);
let owner_cap = OwnerCap<T> {
id: object::new(ctx),
authorized_object_id: object_id,
};
event::emit(OwnerCapCreatedEvent { ... });
owner_cap
}
Important constraint: players cannot create an OwnerCap themselves; it can only be issued by the game server (verify_sponsor).
The significance of this constraint is that it keeps the right to mint permission objects firmly within the system boundary. If anyone could mint OwnerCap<T> themselves, the entire capability system would lose credibility. Capability objects are trustworthy not just because they are on-chain objects, but because the chain of their creation is itself controlled.
5. Receiving Pattern: Safe Borrowing of OwnerCap
This is one of EVE Frontier’s most unique patterns—OwnerCap is usually stored under Character object’s control, borrowed temporarily using Sui’s Receiving<T> when needed:
Character (shared object)
└── Holds → OwnerCap<Gate> (stored via Sui transfer::transfer)
When player operates:
1. Player submits Receiving<OwnerCap<Gate>> ticket (proves right to extract)
2. character::receive_owner_cap() → Temporarily extract OwnerCap<Gate>
3. Execute operation (like modifying Gate configuration)
4. Use return_owner_cap_to_object() to return OwnerCap to Character
Source Code Implementation
/// Borrow OwnerCap from Character
public(package) fun receive_owner_cap<T: key>(
receiving_id: &mut UID,
ticket: Receiving<OwnerCap<T>>, // Sui native Receiving ticket
): OwnerCap<T> {
transfer::receive(receiving_id, ticket)
}
/// Return OwnerCap to Character
public fun return_owner_cap_to_object<T: key>(
owner_cap: OwnerCap<T>,
character: &mut Character,
receipt: ReturnOwnerCapReceipt, // Receipt after operation completes
) {
validate_return_receipt(receipt, object::id(&owner_cap), ...);
transfer::transfer(owner_cap, character.character_address);
}
ReturnOwnerCapReceipt Prevents Loss
public struct ReturnOwnerCapReceipt {
owner_id: address,
owner_cap_id: ID,
}
A function that borrows an OwnerCap must also produce a ReturnOwnerCapReceipt; because the receipt has no abilities (a hot potato), leaving it unconsumed is a compilation error. This ensures:
- OwnerCap will definitely be returned (cannot be lost)
- Must be used in pairs (cannot forge receipt)
The Receiving pattern looks tedious on the surface, but it essentially makes the object-owned lifecycle explicit. Assets held by a regular address can be used through references; capabilities held by objects such as Character or StorageUnit, without an explicit borrow-use-return process, are easily lost or intercepted in complex call chains. EVE chooses to make this process verbose in exchange for a permission flow that is auditable, rollbackable, and strongly constrained.
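The borrow-use-return cycle can be modeled as follows. This is a hedged sketch: `Character`, `borrow`, and `giveBack` are illustrative names, and the receipt here only imitates at runtime the hot-potato constraint that Move enforces at compile time:

```typescript
// Hypothetical model of the Receiving pattern's borrow-use-return flow.
// The receipt plays the hot-potato role: borrowing hands it out with the
// cap, and the return step refuses to finish unless the same cap comes back.
type OwnerCapToken = { capId: string };
type ReturnReceipt = { capId: string };

class Character {
  private custody = new Map<string, OwnerCapToken>();

  store(cap: OwnerCapToken): void { this.custody.set(cap.capId, cap); }

  // Mirrors receive_owner_cap: extract the cap and issue a receipt
  // that must be consumed before the "transaction" ends.
  borrow(capId: string): { cap: OwnerCapToken; receipt: ReturnReceipt } {
    const cap = this.custody.get(capId);
    if (!cap) throw new Error("cap not held by this character");
    this.custody.delete(capId);
    return { cap, receipt: { capId } };
  }

  // Mirrors return_owner_cap_to_object: the receipt must match the cap.
  giveBack(cap: OwnerCapToken, receipt: ReturnReceipt): void {
    if (receipt.capId !== cap.capId) throw new Error("EInvalidReceipt");
    this.custody.set(cap.capId, cap);
  }

  holds(capId: string): boolean { return this.custody.has(capId); }
}
```

Unlike this runtime model, Move rejects a transaction at compile time if the receipt is never consumed, which is exactly why the receipt struct has no abilities.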
6. Complete Permission Hierarchy Diagram
GovernorCap (root key, CCP holds)
│
▼ Configure
AdminACL (server whitelist)
│
▼ verify_sponsor
All privileged operations (create character, create building, issue OwnerCap...)
│
▼ create_owner_cap<T>
OwnerCap<Gate> OwnerCap<Turret> OwnerCap<StorageUnit>...
│ │
▼ Transfer to Character ▼ Transfer to Builder player
Character custody (Receiving pattern) Direct holding
│
▼ receive_owner_cap (Receiving<OwnerCap<Gate>>)
Temporarily borrow → Use → Return
7. ServerAddressRegistry: Signature Verification Whitelist
Unlike AdminACL, ServerAddressRegistry is specifically for signature verification (not function call permissions):
public struct ServerAddressRegistry has key {
id: UID,
authorized_address: Table<address, bool>,
}
public fun is_authorized_server_address(
registry: &ServerAddressRegistry,
server_address: address,
): bool {
registry.authorized_address.contains(server_address)
}
Purpose: Verify signature source in location::verify_proximity:
assert!(
access::is_authorized_server_address(server_registry, message.server_address),
EUnauthorizedServer,
);
Here we can also see the division of labor between AdminACL and ServerAddressRegistry: the former governs who can directly send transactions on the server’s behalf, the latter governs whose off-chain signatures are recognized on-chain. They often come from the same batch of backend systems, but the semantics differ. Merging them into one table saves effort in the short term, but in the long term makes the permission surface very hard to shrink.
8. Builder Perspective: How to Properly Use OwnerCap
When Creating Building
// When game server creates Gate for Builder, automatically creates and transfers OwnerCap<Gate>
public fun create_gate_with_owner(...) {
admin_acl.verify_sponsor(ctx);
let gate = Gate { ... };
let owner_cap = create_owner_cap(&admin_acl, &gate, ctx);
// owner_cap transferred to builder, builder controls this Gate
transfer::share_object(gate);
transfer::public_transfer(owner_cap, builder_address);
}
When Builder Modifies Building Configuration
// Builder uses OwnerCap to prove they have right to operate the Gate
public fun set_gate_config(
gate: &mut Gate,
owner_cap: &OwnerCap<Gate>, // Holding grants permission
new_config: GateConfig,
ctx: &TxContext,
) {
// Verify OwnerCap's corresponding object ID matches gate
assert!(owner_cap.authorized_object_id == object::id(gate), EOwnerCapMismatch);
gate.config = new_config;
}
9. Comparison: EVE vs Traditional Contract Permissions
| Scenario | Traditional Contract | EVE Frontier |
|---|---|---|
| Building ownership | Record owner address | OwnerCap<T> object |
| Transfer ownership | Update address field | Transfer OwnerCap<T> object |
| Lend permissions | No standard mechanism | Receiving pattern + ReturnReceipt |
| Server permissions | Hardcoded address | AdminACL (updatable whitelist) |
| Signature verification | None | ServerAddressRegistry |
10. Security Trap: Don’t Hold Too Many OwnerCaps
OwnerCap is declared has key, store, meaning it can be stored in any object or table. Builders need to be careful:
❌ Bad design: Store OwnerCap in public shared object
→ Anyone might call using some vulnerability
✅ Correct design:
- OwnerCap stored in deployer's personal wallet address
- Or managed through Character's Receiving pattern
- Important operations use multi-sig wallet with OwnerCap
More bluntly, OwnerCap<T> should be treated as a control-plane key, not a regular business asset. It shouldn’t be casually placed in public shared objects, nor exposed to many intermediate contracts for “frontend convenience”. You can compare it to a root key in operations: truly secure systems don’t lack root keys, but their root keys appear rarely, circulate rarely, and are always accompanied by additional process constraints when they do appear.
11. Practical Exercises
- Permission Analysis: List all functions in the World contract that require admin_acl.verify_sponsor(ctx), and analyze which ones players can never call directly
- OwnerCap Delegation System: Design a contract that lets a Gate owner delegate partial permissions (such as modifying toll fees) to another address without transferring the OwnerCap itself
- Multi-sig OwnerCap Custody: Implement a 2-of-3 multi-sig scheme in which two of three maintainers must agree before building configuration can be modified
Chapter Summary
| Component | Layer | Purpose |
|---|---|---|
| GovernorCap | Highest (CCP) | System-level configuration, register servers |
| AdminACL | Server layer | Game operation function call authorization |
| ServerAddressRegistry | Server layer | Ed25519 signature source verification |
| OwnerCap<T> | Building layer | Generic building control credential |
| Receiving pattern | Player layer | OwnerCap safe borrowing mechanism |
| ReturnOwnerCapReceipt | Security mechanism | Force OwnerCap return, prevent loss |
Course Complete
Congratulations on completing the EVE Frontier Builder Complete Course!
From basic Move 2024 syntax, to on-chain PvP records (KillMail), to signature verification, location proof, energy fuel systems, Extension pattern, turret AI and access control—you’ve mastered all core knowledge needed to build complex applications on EVE Frontier.
Next Steps:
- Join EVE Frontier Builders Discord
- Deploy your first Extension on testnet
- Find your galaxy in the game, light up a Smart Gate
Building in the stars is an extension of civilization.
Chapter 27: Off-Chain Signature × On-Chain Verification
Learning Objective: Deeply understand the Ed25519 signature verification mechanism in the world::sig_verify module, and master the core security pattern of EVE Frontier: “game server signature → Move contract verification.”
Status: Teaching example. The verification flow in the text is a breakdown of the official implementation. When implementing, prioritize comparing with actual source code and tests.
Minimal Call Chain
Game server constructs message -> Ed25519 signature -> Player submits bytes/signature -> sig_verify module validates -> Contract continues execution
Corresponding Code Directory
Key Structs / Inputs
| Type or Input | Purpose | Reading Focus |
|---|---|---|
| Message bytes | Raw encoding of off-chain facts | Check if off-chain signature and on-chain verification use exactly the same byte sequence |
| Signature blob | flag + raw_sig + public_key | Check length, slice order, and signature algorithm identifier |
| AdminACL / authorized address | Business-allowed server identity | Check that “signature correct” and “signer authorized” are two layers of validation |
Key Entry Functions
| Entry | Purpose | What to Confirm |
|---|---|---|
| sig_verify related validation entry | Verify signature binding to message | Whether the intent prefix is correctly added, whether bytes are strictly compared |
| Business contract validation wrapper function | Connect signature verification to business flow | Whether nonce, expiration time, object binding are validated together |
| sponsor / server whitelist entry | Restrict acceptable server identities | Whether it’s handled in a layer separate from signature validation |
Most Easily Misunderstood Points
- Signature passing doesn’t equal business passing; business fields still need separate validation
- If even one byte differs in off-chain signature encoding, on-chain verification will inevitably fail
- AdminACL solves “who can submit/sponsor,” not “whether the message content is correct”
When reading the signature system, it’s recommended to break verification into 4 layers rather than collapsing them into one “verification passed means safe”:
- Byte layer: Are the message_bytes seen off-chain and on-chain exactly identical?
- Cryptographic layer: Was the signature truly generated by that private key?
- Identity layer: Does the address corresponding to this private key belong to an allowed server?
- Business layer: Do the message fields (player, object, deadline, nonce, quantity) truly match this call?
sig_verify only handles the first two layers and part of the third. What truly determines business security is how strict your outer wrapper function is.
1. Why Off-Chain Signatures?
A fundamental challenge in EVE Frontier: on-chain contracts cannot access real-time game world state.
| Information | Source | Contract Can Read Directly? |
|---|---|---|
| Player ship position coordinates | Game server real-time calculation | ❌ |
| Whether a player is near a building | Game physics engine | ❌ |
| Today’s PvP kill results | Game combat server | ❌ |
| On-chain object state | Sui state tree | ✅ |
Solution: The game server signs these “facts” into a message off-chain, the player submits this signature to the contract, and the contract verifies the signature’s authenticity.
2. Ed25519 Signature Format
Sui uses standard Ed25519 + personal message signature format.
Signature Composition
signature (97 bytes total):
┌─────────┬───────────────────┬──────────────────┐
│ flag │ raw_sig │ public_key │
│ 1 byte │ 64 bytes │ 32 bytes │
│ (0x00) │ (Ed25519 sig) │ (Ed25519 PK) │
└─────────┴───────────────────┴──────────────────┘
Constant Definitions (from source code)
const ED25519_FLAG: u8 = 0x00; // Ed25519 scheme identifier
const ED25519_SIG_LEN: u64 = 64; // Signature length
const ED25519_PK_LEN: u64 = 32; // Public key length
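The 97-byte layout can be parsed mechanically. Below is a TypeScript sketch of the splitting step only; no cryptography is performed, and the function name is mine:

```typescript
// Parse the 97-byte signature blob: flag(1) + raw_sig(64) + public_key(32).
// Pure byte handling, mirroring the layout described above.
const ED25519_FLAG = 0x00;
const ED25519_SIG_LEN = 64;
const ED25519_PK_LEN = 32;

function splitSignature(blob: Uint8Array): {
  flag: number; rawSig: Uint8Array; publicKey: Uint8Array;
} {
  if (blob.length !== 1 + ED25519_SIG_LEN + ED25519_PK_LEN) {
    throw new Error("EInvalidLen");
  }
  const flag = blob[0];
  if (flag !== ED25519_FLAG) throw new Error("EUnsupportedScheme");
  return {
    flag,
    rawSig: blob.slice(1, 1 + ED25519_SIG_LEN),      // bytes 1..64
    publicKey: blob.slice(1 + ED25519_SIG_LEN),       // bytes 65..96
  };
}
```

This is the same slicing the Move side performs with extract_bytes in section 3.3.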
3. Source Code Deep Dive: sig_verify.move
3.1 Deriving Sui Address from Public Key
public fun derive_address_from_public_key(public_key: vector<u8>): address {
assert!(public_key.length() == ED25519_PK_LEN, EInvalidPublicKeyLen);
// Sui address = Blake2b256(flag_byte || public_key)
let mut concatenated: vector<u8> = vector::singleton(ED25519_FLAG);
concatenated.append(public_key);
sui::address::from_bytes(hash::blake2b256(&concatenated))
}
Formula: sui_address = Blake2b256(0x00 || ed25519_public_key)
This means if you know the game server’s Ed25519 public key, you can predict its Sui address.
3.2 PersonalMessage Intent Prefix
// x"030000" is three bytes:
// 0x03 = IntentScope::PersonalMessage
// 0x00 = IntentVersion::V0
// 0x00 = AppId::Sui
let mut message_with_intent = x"030000";
message_with_intent.append(message);
let digest = hash::blake2b256(&message_with_intent);
⚠️ Important Detail: The message is directly appended (not BCS serialized), which differs from Sui wallet’s default signing behavior. The reason is that the game server’s Go/TypeScript side uses the SignPersonalMessage approach to operate directly on bytes.
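The prefixing step can be reproduced byte-for-byte off-chain. A small sketch, assuming the raw message bytes are already encoded exactly as the contract expects (hashing and signing are out of scope here):

```typescript
// Build the intent-prefixed message exactly as the Move side does:
// the raw bytes are appended after 0x03 0x00 0x00, with no extra serialization.
function withPersonalMessageIntent(message: Uint8Array): Uint8Array {
  const out = new Uint8Array(3 + message.length);
  out.set([0x03, 0x00, 0x00], 0); // IntentScope, IntentVersion, AppId
  out.set(message, 3);
  return out;
}
```

Both sides then hash this buffer with Blake2b-256; if the prefix or the message bytes differ by even one byte, verification fails.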
3.3 Complete Verification Flow
public fun verify_signature(
message: vector<u8>,
signature: vector<u8>,
expected_address: address,
): bool {
let len = signature.length();
assert!(len >= 1, EInvalidLen);
// 1. Extract scheme flag from first byte
let flag = signature[0];
// 2. Move 2024 match syntax (similar to Rust)
let (sig_len, pk_len) = match (flag) {
ED25519_FLAG => (ED25519_SIG_LEN, ED25519_PK_LEN),
_ => abort EUnsupportedScheme,
};
assert!(len == 1 + sig_len + pk_len, EInvalidLen);
// 3. Split signature bytes
let raw_sig = extract_bytes(&signature, 1, 1 + sig_len);
let raw_public_key = extract_bytes(&signature, 1 + sig_len, len);
// 4. Construct message digest with intent prefix
let mut message_with_intent = x"030000";
message_with_intent.append(message);
let digest = hash::blake2b256(&message_with_intent);
// 5. Verify public key corresponds to Sui address
let sig_address = derive_address_from_public_key(raw_public_key);
if (sig_address != expected_address) {
return false
};
// 6. Verify Ed25519 signature
match (flag) {
ED25519_FLAG => {
ed25519::ed25519_verify(&raw_sig, &raw_public_key, &digest)
},
_ => abort EUnsupportedScheme,
}
}
3.4 Byte Extraction Helper Function
// Move 2024's vector::tabulate! macro: concisely create slices
fun extract_bytes(source: &vector<u8>, start: u64, end: u64): vector<u8> {
vector::tabulate!(end - start, |i| source[start + i])
}
4. End-to-End Flow
Game Server (Go/Node.js)
│
├─ Construct message: message = bcs_encode(LocationProofMessage)
├─ Add intent prefix: msg_with_intent = 0x030000 + message
├─ Calculate digest: digest = blake2b256(msg_with_intent)
└─ Sign: signature = ed25519_sign(server_private_key, digest)
↓
Player calls contract (Sui PTB)
│
└─ verify_signature(message, flag+sig+pk, server_address)
↓
Move Contract
├─ Rebuild digest (same algorithm)
├─ Extract public_key from signature
├─ Verify address(public_key) == server_address (anti-forgery)
└─ ed25519_verify(sig, pk, digest) → true/false
The most easily overlooked aspect in this end-to-end flow is “what exactly is the signature binding to.” If the server signs something like “Player A can claim reward today” — a broad semantic — rather than “Player A can execute action=2 once for item_id=123 before deadline,” then while verification is correct, the permission boundary is still too wide. Many replay vulnerabilities and misuse vulnerabilities aren’t in the cryptographic algorithm but in the message semantics being too loose.
5. How to Use in Builder Contracts?
5.1 Basic Usage: Verifying Server-Issued Permits
module my_extension::server_permit;
use world::sig_verify;
use world::access::ServerAddressRegistry;
use std::bcs;
public struct PermitMessage has copy, drop {
player: address,
action_type: u8, // 1=pass, 2=item reward
item_id: u64,
deadline_ms: u64,
}
public fun redeem_server_permit(
server_registry: &ServerAddressRegistry,
message_bytes: vector<u8>,
signature: vector<u8>,
ctx: &mut TxContext,
) {
// 1. Deserialize message (assuming server used BCS serialization)
let msg = bcs::from_bytes<PermitMessage>(message_bytes);
// 2. Verify deadline
// (Actual implementation needs Clock, simplified here)
// 3. Verify signature from authorized server
// Get server address from registry
let server_addr = get_server_address(server_registry);
assert!(
sig_verify::verify_signature(message_bytes, signature, server_addr),
EInvalidSignature,
);
// 4. Execute business logic
assert!(msg.player == ctx.sender(), EPlayerMismatch);
// ...grant items, points, etc.
}
When actually writing Builder contracts, you should at minimum include 5 binding items: player, action_type, target object id, deadline, nonce/request_id. Missing any one could result in “signature itself is fine, but was used to do something not originally intended.” A simple principle: any field you don’t want users to replace, reuse, or delay execution should be included in the signed bytes. A well-designed permission system binds player_address, target_structure_id, target_location_hash, deadline_ms, and even business identifiers in data into an inseparable statement.
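As a sketch of this principle, the five binding fields can be packed into one deterministic byte string before signing. The real system uses BCS; this fixed-order little-endian layout (field names are mine) only illustrates that every field you don't want swapped, reused, or delayed must sit inside the signed bytes:

```typescript
// Illustrative permit encoding: player(32) | action(1) | target(32) |
// deadline u64 LE (8) | nonce u64 LE (8). Not the on-chain BCS format.
function u64le(n: bigint): Uint8Array {
  const b = new Uint8Array(8);
  for (let i = 0; i < 8; i++) b[i] = Number((n >> BigInt(8 * i)) & 0xffn);
  return b;
}

function encodePermit(p: {
  player: Uint8Array;         // 32-byte address
  actionType: number;         // u8
  targetObjectId: Uint8Array; // 32-byte object id
  deadlineMs: bigint;         // u64
  nonce: bigint;              // u64
}): Uint8Array {
  const parts = [p.player, Uint8Array.of(p.actionType), p.targetObjectId,
                 u64le(p.deadlineMs), u64le(p.nonce)];
  const total = parts.reduce((n, x) => n + x.length, 0);
  const out = new Uint8Array(total);
  let off = 0;
  for (const x of parts) { out.set(x, off); off += x.length; }
  return out;
}
```

Changing any single field changes the signed bytes, so the signature covers the whole statement rather than a loose "player may act".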
5.2 In Practice: Location Proof Verification (Preview of Ch.28 content)
verify_proximity in location.move is a typical application of sig_verify:
// world/sources/primitives/location.move
public fun verify_proximity(
location: &Location,
proof: LocationProof,
server_registry: &ServerAddressRegistry,
clock: &Clock,
ctx: &mut TxContext,
) {
let LocationProof { message, signature } = proof;
// Step 1: Verify message fields (location hash, sender address, etc.)
validate_proof_message(&message, location, server_registry, ctx.sender());
// Step 2: BCS encode message
let message_bytes = bcs::to_bytes(&message);
// Step 3: Verify deadline not expired
assert!(is_deadline_valid(message.deadline_ms, clock), EDeadlineExpired);
// Step 4: Call sig_verify to verify signature!
assert!(
sig_verify::verify_signature(
message_bytes,
signature,
message.server_address,
),
ESignatureVerificationFailed,
)
}
6. From TypeScript to On-Chain: Complete Example
Server-Side Signing (TypeScript/Node.js)
import { Ed25519Keypair } from '@mysten/sui/keypairs/ed25519';
import { blake2b } from '@noble/hashes/blake2b';
const serverKeypair = Ed25519Keypair.fromSecretKey(SERVER_PRIVATE_KEY);
// Construct message (consistent with BCS format in Move)
const message = {
server_address: serverKeypair.getPublicKey().toSuiAddress(),
player_address: playerAddress,
// ...other fields
};
// Serialize (BCS)
const messageBytes = bcs.serialize(PermitMessage, message);
// Add PersonalMessage intent prefix
const intentPrefix = new Uint8Array([0x03, 0x00, 0x00]);
const msgWithIntent = new Uint8Array([...intentPrefix, ...messageBytes]);
// Calculate Blake2b-256 digest
const digest = blake2b(msgWithIntent, { dkLen: 32 });
// Sign with server private key
const rawSig = serverKeypair.signData(digest); // 64 bytes
// Build complete signature: flag (1) + sig (64) + pubkey (32) = 97 bytes
const pubKey = serverKeypair.getPublicKey().toRawBytes(); // 32 bytes
const fullSignature = new Uint8Array([0x00, ...rawSig, ...pubKey]);
Player Submits to On-Chain (TypeScript/PTB)
const tx = new Transaction();
tx.moveCall({
target: `${PACKAGE_ID}::my_extension::redeem_server_permit`,
arguments: [
tx.object(SERVER_REGISTRY_ID),
tx.pure(bcs.vector(bcs.u8()).serialize(Array.from(messageBytes))),
tx.pure(bcs.vector(bcs.u8()).serialize(Array.from(fullSignature))),
],
});
await client.signAndExecuteTransaction({ signer: playerKeypair, transaction: tx });
7. Match Syntax: Move 2024 New Feature
sig_verify.move extensively uses Move 2024’s match expression:
// Move 2024 match (similar to Rust)
let (sig_len, pk_len) = match (flag) {
ED25519_FLAG => (ED25519_SIG_LEN, ED25519_PK_LEN),
_ => abort EUnsupportedScheme,
};
Compared to old syntax:
// Move old syntax
let sig_len: u64;
let pk_len: u64;
if (flag == ED25519_FLAG) {
sig_len = ED25519_SIG_LEN;
pk_len = ED25519_PK_LEN;
} else {
abort EUnsupportedScheme
};
8. Security Considerations
| Risk | Protection Mechanism |
|---|---|
| Forged signature | Ed25519 cryptographic guarantee |
| Replay attack (same proof submitted repeatedly) | deadline_ms expiration time + one-time verification mark |
| Wrong server signature | derive_address_from_public_key verifies address match |
| Unregistered server | ServerAddressRegistry whitelist filtering |
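The "one-time verification mark" row above can be sketched as a used-permit ledger. Names and structure are illustrative; on-chain this would typically be a Table keyed by a nonce or proof id:

```typescript
// Illustrative replay protection: combine the deadline check with a set of
// consumed permit ids so the same signed message cannot be redeemed twice.
class PermitLedger {
  private used = new Set<string>();

  redeem(permitId: string, deadlineMs: number, nowMs: number): void {
    if (deadlineMs <= nowMs) throw new Error("EDeadlineExpired");
    if (this.used.has(permitId)) throw new Error("EPermitAlreadyUsed");
    this.used.add(permitId); // mark consumed before granting any effects
  }
}
```

Note the ordering: the permit is marked consumed before any rewards are granted, so a partial failure cannot leave a redeemable permit behind.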
9. Practice Exercises
- Signature Verification Tool: Implement a “signature generator” in TypeScript that generates pass permit signatures for players using test keys
- Single-Use Credential: Design a contract that receives server-issued “single-use items,” marks them as “used” on-chain after verification to prevent replay
- Multi-Server Support: Read the design of ServerAddressRegistry and think about how to support multiple game server nodes signing the same credential
Chapter Summary
| Concept | Key Points |
|---|---|
| Ed25519 signature format | flag(1) + sig(64) + pubkey(32) = 97 bytes |
| PersonalMessage intent | 0x030000 prefix + message, Blake2b256 digest |
| Address verification | `Blake2b256(0x00 \|\| pubkey)` must match the expected server address |
| Match syntax | Move 2024 new feature, replaces if/else branches |
| tabulate! macro | Concise byte slice operations |
Next Chapter: Location Proof Protocol — BCS serialization of LocationProof, proximity verification, and how to require players to “be present” in building contracts.
Chapter 28: Location Proof Protocol Deep Dive
Learning Objective: Master the core design of the world::location module — location hashing, BCS deserialization, LocationProof verification, and a complete implementation of requiring players to “be present” in Builder extensions.
Status: Teaching example. Location proof message organization and signature flow will vary by business. This chapter focuses on protocol structure and verification boundaries.
Minimal Call Chain
Game server observes location -> Generate LocationProof -> Player submits proof -> Contract deserializes and verifies -> Allow/deny business action
Corresponding Code Directory
Key Structs
| Type | Purpose | Reading Focus |
|---|---|---|
| Location | On-chain location hash container | See that only the hash is stored on-chain, not plaintext coordinates |
| LocationProofMessage | Server-signed location proof message body | See whether player, source object, target object, distance, and deadline are all bound |
| LocationProof | Proof payload submitted on-chain | See how bytes, signature, and message body are combined |
Key Entry Functions
| Entry | Purpose | What to Confirm |
|---|---|---|
| verify_proximity | Verify “player is near target” | Whether signature, target object, distance threshold, and time window are all validated |
| BCS deserialization path | Restore proof from bytes | Whether field order matches off-chain encoding exactly |
| Business module wrapper entry | Connect proximity proof to Gate / Turret / Storage | Whether proof is bound to specific business object rather than generic reuse |
Most Easily Misunderstood Points
- Location proof doesn’t just prove “I am present,” but proves “I am near a certain object, within a certain time window”
- Only checking distance without checking target object allows proof to be misused across different business entries
- BCS field order mismatch usually isn’t a cryptography issue but an encoding issue
Location proof is best understood as a protocol layer, not as “a signature object.” It has at least 4 layers of meaning: who is present, relative to what, within what time window, and what other business context is bound. Truly secure Builder designs don’t just check the distance field alone, but bind player_address, target_structure_id, target_location_hash, deadline_ms, and even business identifiers in data into an inseparable statement.
1. Core Problem of Location System
EVE Frontier’s on-chain contracts face a fundamental challenge: How to verify a player (ship) is currently near a certain spatial location?
On-chain contracts cannot access real-time game world location data. EVE Frontier’s solution is LocationProof:
Game server observes "Player A is near Building B (distance < 1000m)"
↓
Server signs this "observed fact" into a LocationProof
↓
Player A submits this proof to on-chain contract
↓
Contract verifies signature, location hash, expiration time then executes business logic
2. LocationProof Data Structure
// world/sources/primitives/location.move
/// Location hash (32 bytes, mixed hash containing x/y/z coordinates)
public struct Location has store {
location_hash: vector<u8>, // 32 bytes
}
/// Server-signed location proof message body
public struct LocationProofMessage has copy, drop {
server_address: address, // Signer (server address)
player_address: address, // Player wallet address being proven
source_structure_id: ID, // ID of structure player is at
source_location_hash: vector<u8>, // Hash of player's location
target_structure_id: ID, // Target building's ID
target_location_hash: vector<u8>, // Hash of target's location
distance: u64, // Distance between them (game units)
data: vector<u8>, // Stores additional business data
deadline_ms: u64, // Proof expiration time (milliseconds)
}
/// Complete location proof (message body + signature)
public struct LocationProof has drop {
message: LocationProofMessage,
signature: vector<u8>,
}
The most noteworthy field here is actually data. Its existence isn’t to “add more notes,” but to reserve extension binding positions for different businesses. For example, a treasure chest system can write chest type or opening round into it, a market system can write market_id or order context into it. This way a proof isn’t just “I am at a location,” but “I am at a location, and this proof is for a specific business entry.” If this binding layer is abandoned, proofs can easily be misused across multiple entries.
3. Complete Analysis of verify_proximity Function
public fun verify_proximity(
location: &Location, // Target building's on-chain location object
proof: LocationProof, // Player-submitted proof
server_registry: &ServerAddressRegistry, // Authorized server whitelist
clock: &Clock,
ctx: &mut TxContext,
) {
let LocationProof { message, signature } = proof;
// ① Validate message field validity
validate_proof_message(&message, location, server_registry, ctx.sender());
// ② Serialize message struct to bytes (BCS format)
let message_bytes = bcs::to_bytes(&message);
// ③ Verify deadline not expired
assert!(is_deadline_valid(message.deadline_ms, clock), EDeadlineExpired);
// ④ Call sig_verify to verify Ed25519 signature
assert!(
sig_verify::verify_signature(
message_bytes,
signature,
message.server_address,
),
ESignatureVerificationFailed,
)
}
validate_proof_message Internal Verification
fun validate_proof_message(
message: &LocationProofMessage,
expected_location: &Location,
server_registry: &ServerAddressRegistry,
sender: address,
) {
// 1. Server address is in whitelist
assert!(
access::is_authorized_server_address(server_registry, message.server_address),
EUnauthorizedServer,
);
// 2. Player address in message matches caller (prevent others using your proof)
assert!(message.player_address == sender, EUnverifiedSender);
// 3. Target location hash matches on-chain Location object
assert!(
message.target_location_hash == expected_location.location_hash,
EInvalidLocationHash,
);
}
Triple verification ensures security:
- ✅ Signature from authorized server
- ✅ Proof issued for current caller (prevent front-running)
- ✅ Target location matches on-chain object’s location (prevent tampering)
These three verifications solve basic identity and target binding, but Builders often need a fourth verification: business binding. For example, “opening this door” and “opening that chest” even if both are near the same coordinates, should not share the same proof. The safest approach is to make the data or target object field uniquely point to this business entry, rather than relying only on spatial proximity.
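The triple verification, plus the business binding suggested above, can be mirrored off-chain for testing. A hypothetical TypeScript model (field and function names are mine, not the on-chain API):

```typescript
// Off-chain mirror of the validate_proof_message checks, with a fourth
// check for business binding via the data field. Illustrative only.
interface ProofMessage {
  serverAddress: string;
  playerAddress: string;
  targetLocationHash: string;
  data: string; // business context, e.g. "open-chest:42"
}

function validateProofMessage(
  msg: ProofMessage,
  authorizedServers: Set<string>,
  sender: string,
  expectedLocationHash: string,
  expectedBusinessTag: string,
): void {
  // 1. Server address is whitelisted
  if (!authorizedServers.has(msg.serverAddress)) throw new Error("EUnauthorizedServer");
  // 2. Proof was issued for this caller
  if (msg.playerAddress !== sender) throw new Error("EUnverifiedSender");
  // 3. Target location matches the on-chain object
  if (msg.targetLocationHash !== expectedLocationHash) throw new Error("EInvalidLocationHash");
  // 4. Business binding: the proof targets exactly this entry point
  if (msg.data !== expectedBusinessTag) throw new Error("EWrongBusinessContext");
}
```

Such a mirror is useful in server-side tests: you can assert that a proof minted for one business entry is rejected by every other entry before anything reaches the chain.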
4. BCS Deserialization: Restoring LocationProof from Bytes
When players submit proof_bytes (raw bytes) via SDK rather than a struct, the contract needs manual deserialization:
public fun verify_proximity_proof_from_bytes(
server_registry: &ServerAddressRegistry,
location: &Location,
proof_bytes: vector<u8>,
clock: &Clock,
ctx: &mut TxContext,
) {
// Manual BCS deserialization
let (message, signature) = unpack_proof(proof_bytes);
// ...(same as verify_proximity afterwards)
}
unpack_proof’s BCS Manual Deserialization
fun unpack_proof(proof_bytes: vector<u8>): (LocationProofMessage, vector<u8>) {
let mut bcs_data = bcs::new(proof_bytes);
// "Peel" fields in BCS field order
let server_address = bcs_data.peel_address();
let player_address = bcs_data.peel_address();
// ID type restored via address
let source_structure_id = object::id_from_address(bcs_data.peel_address());
// vector<u8> type uses peel_vec! macro
let source_location_hash = bcs_data.peel_vec!(|bcs| bcs.peel_u8());
let target_structure_id = object::id_from_address(bcs_data.peel_address());
let target_location_hash = bcs_data.peel_vec!(|bcs| bcs.peel_u8());
let distance = bcs_data.peel_u64();
let data = bcs_data.peel_vec!(|bcs| bcs.peel_u8());
let deadline_ms = bcs_data.peel_u64();
let signature = bcs_data.peel_vec!(|bcs| bcs.peel_u8());
let message = LocationProofMessage {
server_address, player_address, source_structure_id,
source_location_hash, target_structure_id, target_location_hash,
distance, data, deadline_ms,
};
(message, signature)
}
peel_vec! macro: Standard way to handle BCS-encoded vector<u8> in Move 2024, equivalent to reading the length first, then reading bytes one by one.
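Why field order matters is easy to demonstrate with a minimal BCS-style reader. This sketch supports only what the example needs (little-endian u64, single-byte length prefixes for vectors shorter than 128); it is an illustration, not a BCS implementation:

```typescript
// Minimal BCS-style reader: u64 is little-endian, vectors are
// length-prefixed (ULEB128; a single byte suffices for lengths < 128).
// Peeling fields in the wrong order silently yields garbage values.
class ByteReader {
  private off = 0;
  constructor(private buf: Uint8Array) {}

  peelU64(): bigint {
    let v = 0n;
    for (let i = 0; i < 8; i++) v |= BigInt(this.buf[this.off + i]) << BigInt(8 * i);
    this.off += 8;
    return v;
  }

  peelVecU8(): Uint8Array {
    const len = this.buf[this.off]; // ULEB128, single byte when len < 128
    this.off += 1;
    const out = this.buf.slice(this.off, this.off + len);
    this.off += len;
    return out;
  }
}
```

If the off-chain encoder writes `distance` before `data` but the reader peels `data` first, there is no error, only wrong values, which is exactly the "encoding issue, not cryptography issue" failure mode described above.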
5. Distance Verification
Besides “whether nearby,” also supports “whether distance between two structures meets requirements”:
public fun verify_distance(
location: &Location,
server_registry: &ServerAddressRegistry,
proof_bytes: vector<u8>,
max_distance: u64, // Builder-set maximum distance threshold
ctx: &mut TxContext,
) {
let (message, signature) = unpack_proof(proof_bytes);
validate_proof_message(&message, location, server_registry, ctx.sender());
let message_bytes = bcs::to_bytes(&message);
// Verify distance doesn't exceed Builder-set threshold
assert!(message.distance <= max_distance, EOutOfRange);
assert!(
sig_verify::verify_signature(message_bytes, signature, message.server_address),
ESignatureVerificationFailed,
)
}
Same Location Verification (No Signature Needed)
/// Verify two temporary inventories are at same location (for EVE space P2P trading)
public fun verify_same_location(location_a_hash: vector<u8>, location_b_hash: vector<u8>) {
assert!(location_a_hash == location_b_hash, ENotInProximity);
}
6. Builder Practice: Space-Restricted Trading Market
module my_market::space_market;
use world::location::{Self, Location, LocationProof};
use world::access::ServerAddressRegistry;
use sui::clock::Clock;
use sui::coin::Coin;
use sui::sui::SUI;
/// Only players near market can purchase
public fun buy_item(
market: &mut Market,
market_location: &Location, // Market's on-chain location object
proximity_proof: LocationProof, // Player-submitted location proof
server_registry: &ServerAddressRegistry,
payment: Coin<SUI>,
item_id: u64,
clock: &Clock,
ctx: &mut TxContext,
) {
// Verify player is near market (core guard)
location::verify_proximity(
market_location,
proximity_proof,
server_registry,
clock,
ctx,
);
// Subsequent business logic
// ...
}
7. Builder Practice: Location-Locked Treasure Chest
module my_treasure::chest;
use world::location::{Self, Location};
use world::access::ServerAddressRegistry;
use sui::clock::Clock;
/// Can only open chest when at chest location
public fun open_chest(
chest: &mut TreasureChest,
chest_location: &Location,
proximity_proof_bytes: vector<u8>,
server_registry: &ServerAddressRegistry,
clock: &Clock,
ctx: &mut TxContext,
) {
// Use bytes interface (server passes bytes directly, no need to construct struct in PTB)
location::verify_proximity_proof_from_bytes(
server_registry,
chest_location,
proximity_proof_bytes,
clock,
ctx,
);
// Open chest!
let loot = chest.claim_loot(ctx);
transfer::public_transfer(loot, ctx.sender());
}
8. Location Proof Expiration Mechanism
fun is_deadline_valid(deadline_ms: u64, clock: &Clock): bool {
let current_time_ms = clock.timestamp_ms();
deadline_ms > current_time_ms
}
Game servers typically set a validity period of 30 seconds to 5 minutes for location proofs. After expiration, players need to request a new proof from the server.
Design Recommendation:
- One-time actions (like opening chest): Set 30 second validity
- Continuous actions (like mining session): Set 5 minute validity, refresh periodically
Expiration time essentially balances two things: security window and interaction cost. If the window is too long, the risk of the proof being intercepted, or of the player delaying its use, increases; if it is too short, network jitter, wallet confirmation delays, and sponsored-transaction queuing turn into false negatives. When designing as a Builder, don’t just ask “theoretically, how short is safest”; also measure how long it typically takes from server signature to on-chain finalization on your real transaction path.
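A dApp can apply the same check client-side before submitting, to avoid wasting a transaction on an already-expired proof. A minimal sketch, where the 10-second safety margin is an assumption to tune against your observed finalization latency:

```typescript
// Client-side pre-check mirroring the on-chain is_deadline_valid logic.
// The margin absorbs wallet confirmation and finalization latency, so we
// don't submit a proof that expires mid-flight.
function isProofStillUsable(deadlineMs: number, nowMs: number, marginMs: number = 10_000): boolean {
  return deadlineMs > nowMs + marginMs;
}
```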
9. Special Handling for Testing
Since test environments cannot run real game server signatures, world-contracts provides test versions without deadline verification:
#[test_only]
public fun verify_proximity_without_deadline(
server_registry: &ServerAddressRegistry,
location: &Location,
proof: LocationProof,
ctx: &mut TxContext,
): bool {
let LocationProof { message, signature } = proof;
validate_proof_message(&message, location, server_registry, ctx.sender());
let message_bytes = bcs::to_bytes(&message);
sig_verify::verify_signature(message_bytes, signature, message.server_address)
}
In tests, you can pre-generate a fixed “never expires” signature, bypassing time checks.
Chapter Summary
| Concept | Key Points |
|---|---|
| Location | 32-byte hash, maintained by game server |
| LocationProof | Message body + Ed25519 signature, limited validity |
| Triple verification | Server whitelist + player address match + location hash match |
| verify_distance | Supports verifying an upper limit on the distance between two structures |
| BCS peel manual deserialization | Field order must match struct definition |
Next Chapter: Energy and Fuel System — Deep dive into EVE Frontier’s dual-layer energy mechanism for building operations, and precise calculation logic for fuel consumption rates.
Chapter 29: Energy and Fuel System Mechanics
Learning Objective: Deeply understand EVE Frontier’s dual-layer energy mechanism for building operations — Energy (power capacity) and Fuel (fuel consumption), master the source code design of the world::energy and world::fuel modules, and learn to write Builder extensions that interact with these two systems.
Status: Teaching example. The energy/fuel models in the text help you understand official implementations; refer to actual modules for fields and entries when implementing.
Minimal Call Chain
Network Node allocates energy -> Building checks energy/fuel conditions -> Business module consumes fuel -> Building state updates
Corresponding Code Directory
Key Structs
| Type | Purpose | Reading Focus |
|---|---|---|
| EnergyConfig | Energy configuration for different assembly types | How the type-to-energy-requirement mapping is maintained |
| EnergySource | Network node’s power supply state | Relationship between max output, current output, and reserved energy |
| Fuel-related structures | Building fuel inventory and consumption state | How fuel inventory and time rate are bound |
| FuelEfficiency | Fuel type and efficiency differences | How different fuels affect runtime and cost |
Key Entry Functions
| Entry | Purpose | What to Confirm |
|---|---|---|
| available_energy | Calculate remaining available energy | Whether current output and reserved amount are updated synchronously |
| Fuel consumption entry | Deduct fuel when business executes | Whether fuel deduction is bound in same transaction as business action |
| Building online/offline path | Judge state combining energy + fuel | Whether both condition sets are satisfied |
Most Easily Misunderstood Points
- Energy is more like capacity/quota, not a “wallet balance that can be slowly spent”
- Only replenishing fuel without replenishing energy can still cause a building to go offline
- State judgment must be in same transaction as resource deduction, otherwise frontend easily reads stale state
The most important understanding in this chapter isn’t memorizing field names, but distinguishing capacity constraint and consumption constraint. Energy answers “does this building have the right to run on this power grid”; Fuel answers “how long can it maintain right now”. The former is more like concurrency quota, the latter more like time ledger. Mixing these two into one balance model makes Builders prone to errors when designing online status, warning logic, and supply systems.
1. Why a Dual-Layer Energy System?
EVE Frontier’s buildings (SmartAssembly) need to manage two different types of “resources” simultaneously:
| Concept | Corresponding Module | Nature | Analogy |
|---|---|---|---|
| Energy | world::energy | Power/capacity, continuously available | Grid capacity (KW) |
| Fuel | world::fuel | Consumable, has inventory | Generator’s fuel oil (liters) |
- Building networking (NetworkNode) allocates certain energy capacity to each connected building
- Buildings themselves need to continuously burn fuel to maintain operation
From a Builder perspective, this means many “offline” cases actually have two completely different root causes: one is no grid capacity, the other is no fuel. Both manifest to the player as “building can’t be used,” but the product responses differ. A capacity shortage usually calls for decisions about network topology, building connection order, or upgrades; a fuel shortage is more of a supply, billing, or operations problem. Separating these two diagnostic surfaces makes subsequent warning and billing systems much clearer.
2. Energy Module
2.1 Core Data Structure
// world/sources/primitives/energy.move
public struct EnergyConfig has key {
id: UID,
// type_id → energy value required for this assembly type
assembly_energy: Table<u64, u64>,
}
public struct EnergySource has store {
max_energy_production: u64, // Max power generation (NetworkNode's energy ceiling)
current_energy_production: u64, // Currently activated power generation
total_reserved_energy: u64, // Total energy reserved by buildings
}
2.2 Energy Calculation Formula
/// Available energy = current production - reserved energy
public fun available_energy(energy_source: &EnergySource): u64 {
if (energy_source.current_energy_production > energy_source.total_reserved_energy) {
energy_source.current_energy_production - energy_source.total_reserved_energy
} else {
0 // Cannot be negative
}
}
2.3 Energy Reservation and Release
When a building (like Gate or Turret) joins NetworkNode:
// Internal package function (Builder doesn't call directly)
public(package) fun reserve(
energy_source: &mut EnergySource,
energy_source_id: ID,
assembly_type_id: u64, // Building type to connect
energy_config: &EnergyConfig, // Read energy required for this type
ctx: &TxContext,
) {
let energy_required = energy_config.assembly_energy(assembly_type_id);
assert!(energy_source.available_energy() >= energy_required, EInsufficientAvailableEnergy);
energy_source.total_reserved_energy = energy_source.total_reserved_energy + energy_required;
event::emit(EnergyReservedEvent { ... });
}
2.4 EnergyConfig Configuration (Admin Only)
public fun set_energy_config(
energy_config: &mut EnergyConfig,
admin_acl: &AdminACL,
assembly_type_id: u64,
energy_required: u64, // How much energy this building type requires
ctx: &TxContext,
) {
admin_acl.verify_sponsor(ctx);
if (energy_config.assembly_energy.contains(assembly_type_id)) {
*energy_config.assembly_energy.borrow_mut(assembly_type_id) = energy_required;
} else {
energy_config.assembly_energy.add(assembly_type_id, energy_required);
};
}
3. Fuel Module (Focus: Time Rate Calculation)
3.1 Core Data Structure
// world/sources/primitives/fuel.move
public struct FuelConfig has key {
id: UID,
// fuel_type_id → efficiency multiplier (BPS, 10000 = 100%)
fuel_efficiency: Table<u64, u64>,
}
public struct Fuel has store {
type_id: Option<u64>, // Currently filled fuel type
quantity: u64, // Remaining fuel quantity
max_capacity: u64, // Fuel tank maximum capacity
burn_rate_in_ms: u64, // Base burn rate (ms/unit)
is_burning: bool, // Whether currently burning
burn_start_time: u64, // Last burn start timestamp
previous_cycle_elapsed_time: u64, // Previous cycle's remaining time (prevent precision loss)
last_updated: u64, // Last update time
}
3.2 Burn Cycle Calculation (Deep Dive)
This is the most complex part of the Fuel module:
fun calculate_units_to_consume(
fuel: &Fuel,
fuel_config: &FuelConfig,
current_time_ms: u64,
): (u64, u64) { // Returns: (consumed units, remaining milliseconds)
if (!fuel.is_burning || fuel.burn_start_time == 0) {
return (0, 0)
};
// 1. Read efficiency for this fuel type from FuelConfig
let fuel_type_id = *option::borrow(&fuel.type_id);
let fuel_efficiency = fuel_config.fuel_efficiency.borrow(fuel_type_id);
// 2. Actual consumption rate = base rate × efficiency coefficient
let actual_consumption_rate_ms =
(fuel.burn_rate_in_ms * fuel_efficiency) / PERCENTAGE_DIVISOR;
// Example: burn_rate=3600000ms(1hr/unit), efficiency=5000(50%)
// Actual per unit = 3600000 * 5000 / 10000 = 1800000ms (30 minutes)
// 3. Calculate total elapsed time (including previous cycle's remaining time)
let elapsed_ms = if (current_time_ms > fuel.burn_start_time) {
current_time_ms - fuel.burn_start_time
} else { 0 };
// Keep previous cycle's "fractional" time to avoid precision loss
let total_elapsed_ms = elapsed_ms + fuel.previous_cycle_elapsed_time;
// 4. Integer division to get consumed units
let units_to_consume = total_elapsed_ms / actual_consumption_rate_ms;
// 5. Remainder becomes next cycle's start time
let remaining_elapsed_ms = total_elapsed_ms % actual_consumption_rate_ms;
(units_to_consume, remaining_elapsed_ms)
}
Why is previous_cycle_elapsed_time needed?
This design addresses a common difficulty in "on-chain timed billing": you can't tick every second like a game server, only settle elapsed time in discrete transactions. So `previous_cycle_elapsed_time` actually saves the time remainder from last settlement that couldn't be fully divided. Without it, each settlement would round down, systematically under-charging fuel over time, eventually draining the economic model.
Timeline example (burn_rate = 1 hour/unit):
│───────────────────────────────────────────────────│
0 60min 90min 120min
First update (at 90min):
elapsed = 90min
units = 90min / 60min = 1 unit consumed
remaining = 90min % 60min = 30min ← Saved to previous_cycle_elapsed_time
Second update (at 120min):
elapsed = 30min (from last burn_start_time)
total = 30min + 30min(previous) = 60min
units = 60min / 60min = 1 unit consumed
remaining = 0
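The settlement arithmetic can be reproduced off-chain, which is handy for predicting fuel depletion in a dApp. A sketch assuming the same BPS convention (10000 = 100%):

```typescript
// Off-chain mirror of calculate_units_to_consume (illustration only).
const PERCENTAGE_DIVISOR = 10_000;

function unitsToConsume(
  burnRateInMs: number,
  efficiencyBps: number,
  elapsedMs: number,
  previousCycleElapsedMs: number,
): { units: number; remainderMs: number } {
  // Actual consumption rate = base rate × efficiency coefficient
  const actualRateMs = Math.floor((burnRateInMs * efficiencyBps) / PERCENTAGE_DIVISOR);
  // Carry the remainder from the last settlement to avoid precision loss
  const totalElapsed = elapsedMs + previousCycleElapsedMs;
  return {
    units: Math.floor(totalElapsed / actualRateMs),
    remainderMs: totalElapsed % actualRateMs,
  };
}
```

Replaying the timeline example above (burn_rate = 1 hour/unit, 100% efficiency): settling at 90 minutes consumes 1 unit with a 30-minute remainder, and settling 30 minutes later consumes 1 more unit with zero remainder, so no time is lost to rounding.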
3.3 update Function: Batch Settlement
/// Game server periodically calls this function to settle fuel consumption
public(package) fun update(
fuel: &mut Fuel,
assembly_id: ID,
assembly_key: TenantItemId,
fuel_config: &FuelConfig,
clock: &Clock,
) {
// Not burning → return directly
if (!fuel.is_burning || fuel.burn_start_time == 0) { return };
let current_time_ms = clock.timestamp_ms();
if (fuel.last_updated == current_time_ms) { return }; // Idempotent within same block
let (units_to_consume, remaining_elapsed_ms) =
calculate_units_to_consume(fuel, fuel_config, current_time_ms);
if (fuel.quantity >= units_to_consume) {
// Enough fuel: consume normally
consume_fuel_units(fuel, ..., units_to_consume, remaining_elapsed_ms, current_time_ms);
fuel.last_updated = current_time_ms;
} else {
// Fuel depleted: automatically stop burning
stop_burning(fuel, assembly_id, assembly_key, fuel_config, clock);
}
}
3.4 A Known Bug (Source Code Comment)
public(package) fun start_burning(fuel: &mut Fuel, ...) {
// ...
if (fuel.quantity != 0) {
// todo : fix bug: consider previous cycle elapsed time
fuel.quantity = fuel.quantity - 1; // Consume 1 unit to start the clock
};
// ...
}
Starting burn directly deducts 1 unit, but doesn’t consider previous_cycle_elapsed_time which may cause this unit to be double-counted. This is a clearly commented known bug in source code. Learning point: even production contracts have bugs; read source code with critical thinking.
4. How Do Builders Sense Fuel Status?
Builder extensions typically don’t manipulate Fuel objects directly (their mutating functions and fields are package-internal), but they can judge indirectly through building status:
use world::assemblies::gate::{Self, Gate};
use world::status;
/// Check if Gate is operational (indirectly reflects fuel status)
public fun is_gate_operational(gate: &Gate): bool {
gate.status().is_online()
}
When fuel depletes, game server calls stop_burning, then building’s Status changes to Offline, Builder contracts sense through Status:
// Only online buildings can process jump requests
assert!(source_gate.status().is_online(), ENotOnline);
This is also an important boundary: World kernel hides fuel details within package, not to limit Builders, but to prevent extensions from directly tampering with underlying billing state. Builders are better suited to build product-layer logic around “whether online,” “whether supply sufficient,” “whether need reminder/charge/donation,” rather than inventing another fuel ledger.
5. Energy vs Fuel State Flow
Fuel State Machine:
EMPTY
│ deposit_fuel()
▼
LOADED
│ start_burning()
▼
BURNING ──── update() ────► Fuel sufficient continue BURNING
│ │
│ ▼ Fuel depleted
│ OFFLINE (building offline)
│ stop_burning()
▼
STOPPED (preserves previous_cycle_elapsed_time)
Energy State Machine (simpler):
OFF
│ start_energy_production()
▼
ON (continuously provides max_energy_production capacity)
│ stop_energy_production()
▼
OFF
6. FuelEfficiency Design: Supporting Multiple Fuel Types
public struct FuelConfig has key {
id: UID,
fuel_efficiency: Table<u64, u64>, // fuel_type_id → efficiency_bps
}
Different fuel types (different type_id) have different efficiencies:
| fuel_type_id | Fuel Name | efficiency_bps | Description |
|---|---|---|---|
| 1001 | Standard Fuel | 10000 (100%) | Baseline efficiency |
| 1002 | High-Efficiency Fuel | 15000 (150%) | Burns longer |
| 1003 | Common Fuel Rod | 8000 (80%) | Cheap but inefficient |
Higher efficiency means same fuel quantity can maintain building operation longer. Builders can require players to use specific fuel types in extensions.
7. Practice Exercises
- Fuel Calculator: Given burn_rate_in_ms = 3600000, fuel_efficiency = 7500, and remaining quantity = 10, calculate how many hours the building can still run
- Fuel Warning Contract: Write a Builder extension that automatically emits an on-chain event to remind the owner when a Gate’s remaining fuel drops below 5 units
- Fuel Donation System: Design a shared FuelDonationPool allowing any player to donate fuel to buildings
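For the first exercise, the arithmetic can be checked with a quick sketch (same BPS convention as the fuel module; 7500 BPS means each unit burns for 45 minutes instead of 60):

```typescript
// Exercise 1 sanity check (illustrative, not on-chain code).
// actual rate = 3_600_000 * 7_500 / 10_000 = 2_700_000 ms per unit (45 min),
// so 10 units last 27_000_000 ms = 7.5 hours.
function runtimeHours(burnRateInMs: number, efficiencyBps: number, quantity: number): number {
  const actualRateMs = (burnRateInMs * efficiencyBps) / 10_000;
  return (actualRateMs * quantity) / 3_600_000;
}
```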
Chapter Summary
| Concept | Key Points |
|---|---|
| EnergySource | Power capacity system, reserve/release mode |
| Fuel | Consumable system, time-based burn cycles |
| previous_cycle_elapsed_time | Prevents precision loss from time rounding |
| fuel_efficiency | Efficiency multiplier for different fuel types (BPS) |
| Known Bug | start_burning’s 1-unit deduction doesn’t consider prior remaining time |
Next Chapter: Extension Pattern in Practice — Using two real examples from official
extension_examples, master standard development flow for Builder extensions.
Chapter 30: Extension Pattern in Practice — Official Example Deep Dive
Learning Objective: Master EVE Frontier Builder extension standard development patterns through deep dive of two real official extension examples in
world-contracts/contracts/extension_examples/.
Status: Mapped to official example directory. Text is structured explanation; recommend reading while opening extension example source code.
Minimal Call Chain
authorize_extension<XAuth> -> Write ExtensionConfig -> Business entry validates rules -> Call World Assembly API
Corresponding Code Directory
Key Structs
| Type | Purpose | Reading Focus |
|---|---|---|
| AdminCap | Management capability for configuring extension rules | Who can write config, who can only read it |
| XAuth / witness type | Binds the extension’s authorization identity | How the witness type becomes the extension switch |
| Config object / dynamic field key | Stores extension rules | Whether rule key matches business entry reads |
Key Entry Functions
| Entry | Purpose | What to Confirm |
|---|---|---|
| authorize_extension<XAuth> | Authorize witness to a World building | Whether the authorization type exactly matches the extension package’s type |
| Config write entry | Initialize tribe / bounty rules | Whether write key and read key match |
| Extension business entry | Actually execute business rules | Whether only reads own config, doesn’t assume World kernel is modified |
Most Easily Misunderstood Points
- Extension pattern isn’t “modifying World contract source code,” but hooking behavior through witness and config objects
- Successful authorization doesn’t mean business will run; inconsistent config keys still can’t read rules
- Once witness type is wrong, problem usually isn’t in logic but in authorization chain itself
Extension pattern’s real power isn’t “can insert custom code,” but it controls extension capability within a very clear boundary: World continues to control core assets and core state, Builder only rewrites rules at allowed facets. This makes EVE’s extensions more like constrained composition rather than arbitrary monkey patching. You can change who can pass gates, what to pay before passing, what configs to satisfy, but can’t secretly rewrite Gate’s underlying ownership and world rules.
1. What is Extension Pattern?
EVE Frontier’s Builder extension system allows any developer to modify game building behavior (Gate, Turret, StorageUnit, etc.) without modifying World contract itself.
Core design: Typed Witness Authorization Pattern
World Contract Builder Extension Package
───────────── ─────────────
Gate has key { pub struct XAuth {}
extension: Option<TypeName> ←──── gate::authorize_extension<XAuth>()
}
When gate activates XAuth, game engine
calls extension functions in XAuth's package
2. Official Example Overview
extension_examples contains two typical examples:
| Example File | Function | Authorization Type |
|---|---|---|
tribe_permit.move | Only allow specific tribe characters to use gate | Identity filtering |
corpse_gate_bounty.move | Submit corpse as “toll” to use gate | Item consumption |
Both rely on shared config framework: config.move
3. Shared Config Framework: config.move
module extension_examples::config;
use sui::dynamic_field as df;
/// Admin capability
public struct AdminCap has key, store { id: UID }
/// Extension's authorization witness type (Typed Witness)
public struct XAuth has drop {}
/// Extension config shared object (uses dynamic fields to store various rules)
public struct ExtensionConfig has key {
id: UID,
admin: address,
}
/// Dynamic field operations: add/update rules
public fun set_rule<K: copy + drop + store, V: store>(
config: &mut ExtensionConfig,
_: &AdminCap, // Only AdminCap holders can set rules
key: K,
value: V,
) {
if (df::exists_(&config.id, key)) {
df::remove<K, V>(&mut config.id, key);
};
df::add(&mut config.id, key, value);
}
/// Check if rule exists
public fun has_rule<K: copy + drop + store>(config: &ExtensionConfig, key: K): bool {
df::exists_(&config.id, key)
}
/// Read rule
public fun borrow_rule<K: copy + drop + store, V: store>(
config: &ExtensionConfig,
key: K,
): &V {
df::borrow(&config.id, key)
}
/// Get XAuth instance (only callable within package)
public(package) fun x_auth(): XAuth { XAuth {} }
Design Highlight: ExtensionConfig uses dynamic fields to store different types of “rules”; each rule has its own Key type (like TribeConfigKey, BountyConfigKey), so rules don’t interfere with each other and can be combined arbitrarily.
This is why both dynamic fields and a typed witness are used here. The dynamic field solves “how rules are stored and extended”; the typed witness solves “who is qualified to trigger this rule set.” The former is data-facing, the latter permission-facing. Many beginners writing their first extension focus only on building config tables and forget the most critical authorization chain; the result is that the configs are in place and the code compiles, but World doesn’t recognize the extension’s identity at all.
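The “one container, many independently keyed rule types” idea can be illustrated off-chain. A TypeScript analog, where plain string keys stand in for Move’s typed Key structs (a conceptual sketch only, not on-chain code):

```typescript
// Off-chain analog of ExtensionConfig's dynamic-field rule store.
interface TribeConfig { tribe: number }
interface BountyConfig { bountyTypeId: number }

class ExtensionConfigModel {
  private rules = new Map<string, unknown>();

  setRule<V>(key: string, value: V): void {
    this.rules.set(key, value); // overwrite-or-add, like set_rule
  }
  hasRule(key: string): boolean {
    return this.rules.has(key);
  }
  borrowRule<V>(key: string): V {
    if (!this.rules.has(key)) throw new Error("rule not found");
    return this.rules.get(key) as V;
  }
}
```

The point of the design is that a tribe rule and a bounty rule live in the same container without colliding, because each is addressed by its own key; on-chain, distinct Key types give the same isolation with compile-time guarantees.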
4. Example One: Tribe Permit (tribe_permit.move)
Function
Only characters belonging to a specific tribe can pass through this Gate.
Core Structure
module extension_examples::tribe_permit;
/// Dynamic field Key
public struct TribeConfigKey has copy, drop, store {}
/// Dynamic field Value
public struct TribeConfig has drop, store {
tribe: u32, // Allowed tribe ID
}
Issue Permit (Core Logic)
public fun issue_jump_permit(
extension_config: &ExtensionConfig,
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
_: &AdminCap, // Requires AdminCap (prevent abuse)
clock: &Clock,
ctx: &mut TxContext,
) {
// 1. Read tribe config
assert!(extension_config.has_rule<TribeConfigKey>(TribeConfigKey {}), ENoTribeConfig);
let tribe_cfg = extension_config.borrow_rule<TribeConfigKey, TribeConfig>(TribeConfigKey {});
// 2. Verify character tribe
assert!(character.tribe() == tribe_cfg.tribe, ENotStarterTribe);
// 3. 5-day validity period (in milliseconds)
let expires_at_timestamp_ms = clock.timestamp_ms() + 5 * 24 * 60 * 60 * 1000;
// 4. Call world::gate to issue JumpPermit NFT
gate::issue_jump_permit<XAuth>( // Use XAuth as witness
source_gate,
destination_gate,
character,
config::x_auth(), // Get witness instance
expires_at_timestamp_ms,
ctx,
);
}
Admin Configuration
public fun set_tribe_config(
extension_config: &mut ExtensionConfig,
admin_cap: &AdminCap,
tribe: u32,
) {
extension_config.set_rule<TribeConfigKey, TribeConfig>(
admin_cap,
TribeConfigKey {},
TribeConfig { tribe },
);
}
5. Example Two: Corpse Bounty Gate (corpse_gate_bounty.move)
Function
Player must deposit a specific type of “corpse item” from inventory into Builder’s StorageUnit to get permission to pass Gate.
Complete Flow
public fun collect_corpse_bounty<T: key + store>(
extension_config: &ExtensionConfig,
storage_unit: &mut StorageUnit, // Builder's item storage
source_gate: &Gate,
destination_gate: &Gate,
character: &Character, // Player character
player_inventory_owner_cap: &OwnerCap<T>, // Player's item ownership credential
corpse_item_id: u64, // Corpse item_id to submit
clock: &Clock,
ctx: &mut TxContext,
) {
// 1. Read bounty config (what type of corpse needed)
assert!(extension_config.has_rule<BountyConfigKey>(BountyConfigKey {}), ENoBountyConfig);
let bounty_cfg = extension_config.borrow_rule<BountyConfigKey, BountyConfig>(BountyConfigKey {});
// 2. Withdraw corpse item from player inventory
// OwnerCap<T> proves player has authority to operate this item
let corpse = storage_unit.withdraw_by_owner<T>(
character,
player_inventory_owner_cap,
corpse_item_id,
1, // Quantity
ctx,
);
// 3. Verify corpse type matches bounty requirement
assert!(corpse.type_id() == bounty_cfg.bounty_type_id, ECorpseTypeMismatch);
// 4. Deposit corpse into Builder's StorageUnit (as "collection")
storage_unit.deposit_item<XAuth>(
character,
corpse,
config::x_auth(),
ctx,
);
// 5. Issue JumpPermit with 5-day validity
let expires_at_timestamp_ms = clock.timestamp_ms() + 5 * 24 * 60 * 60 * 1000;
gate::issue_jump_permit<XAuth>(
source_gate, destination_gate, character,
config::x_auth(), expires_at_timestamp_ms, ctx,
);
}
6. Comparison of Two Patterns
tribe_permit (identity verification):
Player → [Provide Character object] → Verify tribe_id → Issue JumpPermit
corpse_gate_bounty (item consumption):
Player → [Provide corpse item] → Transfer to Builder → Issue JumpPermit
| Attribute | tribe_permit | corpse_gate_bounty |
|---|---|---|
| Verification method | Character attribute | Item ownership |
| Resource consumption | None (permit has time limit) | Consumes one corpse item |
| Reusable | Yes (each issuance requires AdminCap signing) | No (consumes an item each time) |
| Application scenario | Social gating (alliance exclusive) | Economic incentive (bounty hunter) |
These two official examples correspond to Builders’ two most common extension approaches: identity filtering and resource exchange. The former focuses on “who you are,” the latter on “what you bring to exchange.” Once you understand these two parent patterns, many other gameplay mechanics are just variants: whitelist markets, loot exchange, quest tickets, membership privileges, consumable activation, and so on can all be composed along these two paths.
7. Builder Development Checklist
Based on two official examples, steps to develop a standard Extension:
1. Define XAuth witness type (one per extension package)
2. Create ExtensionConfig shared object
3. Create AdminCap (for managing config)
4. Define rule structs (XxxConfig) and corresponding Key types (XxxConfigKey)
5. Implement management functions: set_xxx_config (requires AdminCap)
6. Implement core logic: check rules → business logic → call gate::issue_jump_permit<XAuth>
7. In init() create and transfer ExtensionConfig and AdminCap
When actually implementing, it’s worth checking one more thing: if the extension fails, is World’s core state still safe? A good extension, even when it can’t read its config, permissions don’t match, or payment is insufficient, should only abort the current transaction, never leave Gate, StorageUnit, or Character in a half-completed state. This is also why World tightly controls access to core asset operations, keeping failure rollback within the extension boundary.
8. My First Extension: Toll Gate
module my_toll::paid_gate;
use my_toll::config::{Self, AdminCap, XAuth, ExtensionConfig};
use world::{character::Character, gate::{Self, Gate}};
use sui::{coin::{Self, Coin}, sui::SUI, balance::{Self, Balance}};
use sui::clock::Clock;
public struct TollConfigKey has copy, drop, store {}
public struct TollConfig has drop, store { toll_amount: u64 }
public struct TollVault has key {
id: UID,
balance: Balance<SUI>,
}
public fun pay_toll_and_jump(
extension_config: &ExtensionConfig,
vault: &mut TollVault,
source_gate: &Gate,
destination_gate: &Gate,
character: &Character,
mut payment: Coin<SUI>,
clock: &Clock,
ctx: &mut TxContext,
) {
let toll_cfg = extension_config.borrow_rule<TollConfigKey, TollConfig>(TollConfigKey {});
assert!(coin::value(&payment) >= toll_cfg.toll_amount, 0);
let toll = coin::split(&mut payment, toll_cfg.toll_amount, ctx);
balance::join(&mut vault.balance, coin::into_balance(toll));
if (coin::value(&payment) > 0) {
transfer::public_transfer(payment, ctx.sender());
} else {
coin::destroy_zero(payment);
};
let expires = clock.timestamp_ms() + 60 * 60 * 1000; // 1 hour pass
gate::issue_jump_permit<XAuth>(
source_gate, destination_gate, character,
config::x_auth(), expires, ctx,
);
}
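The payment-splitting logic in pay_toll_and_jump (take exactly the toll, then return or destroy the change) reduces to simple arithmetic. A hedged off-chain sketch:

```typescript
// Off-chain sketch of the toll arithmetic: take toll_amount from the
// payment and compute the change owed back to the player.
function splitToll(paymentValue: number, tollAmount: number): { toll: number; change: number } {
  if (paymentValue < tollAmount) throw new Error("insufficient payment"); // mirrors the assert
  return { toll: tollAmount, change: paymentValue - tollAmount };
}
```

On-chain, the zero-change branch matters: Sui does not allow transferring an empty Coin wrapper silently away, so an exact payment ends with coin::destroy_zero rather than a transfer back to the sender.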
Chapter Summary
| Concept | Key Points |
|---|---|
| Typed Witness (XAuth) | Each extension package’s unique authorization credential, passed into gate::issue_jump_permit<XAuth> |
| ExtensionConfig | Uses dynamic fields to store extensible rules, supports arbitrary rule type combinations |
| TribeConfigKey/BountyConfigKey | Identifying Keys for different rules, avoid type collisions |
| AdminCap | Controls who can modify extension config |
| OwnerCap<T> | Player’s item-operation authorization credential |
Next Chapter: Turret AI Extension Development — Analyzing target priority queue system through
world::turret, developing custom turret AI extensions.
Chapter 31: Turret AI Extension Development
Learning Objective: Deeply understand the target priority system in
the world::turret module, and master complete implementation methods for customizing turret AI behavior through the Extension pattern.
Status: Teaching example. Text focuses on priority model and extension entry points; specific fields should still refer to official
turret module source code.
Minimal Call Chain
Ship enters range/triggers aggression -> turret module collects candidate targets -> extension rules sort -> execute attack decision
Corresponding Code Directory
Key Structs
| Type | Purpose | Reading Focus |
|---|---|---|
| TargetCandidate | Turret decision input candidate set | Which fields participate in filtering, which in sorting |
| ReturnTargetPriorityList | Extension-returned priority result | Does the extension return a “sort suggestion” or a “direct fire command” |
| BehaviourChangeReason | Reason triggering this recalculation | Does the AI refresh come from entering range, attack behavior, or a state change |
| OnlineReceipt | Turret online-status credential | Whether extension logic depends on the online prerequisite |
Key Entry Functions
| Entry | Purpose | What to Confirm |
|---|---|---|
| Turret candidate set calculation path | Collect attackable targets | Whether filter conditions come before sorting |
| Extension priority entry | Custom AI sorting rules | Whether return value meets World side expectations |
| Authorization and online entry | Hook extension to turret | Whether extension is truly enabled and state synced |
Most Easily Misunderstood Points
- Turret AI extension point is usually “sorting,” not bypassing kernel to directly take over firing
- Only changing priority without changing filter conditions, turret may still attack wrong targets
- Candidate target fields come from game events and kernel state, shouldn’t fabricate based on frontend or off-chain cache
This chapter first needs to distinguish two things: who qualifies to be a candidate target, and among candidate targets who ranks first. Former is a filtering problem, determining whether target enters candidate set; latter is a sorting problem, determining who to shoot first. Most Builder AI extensions can safely influence the latter, not completely overturn the former. This design separates “world rules” from “local strategies,” preventing one extension package from directly turning turret into any weapon it wants.
1. What is a Turret?
Smart Turret is a programmable space building in EVE Frontier that can automatically fire at ships entering its range.
Two key behavior trigger points:
| Trigger | Description |
|---|---|
| InProximity | Ship enters turret range |
| Aggression | Ship starts/stops attacking own buildings |
Default behavior: Attack all ships entering range.
Builder extension capability: Customize target priority sorting — determine which targets turret attacks first.
2. TargetCandidate Data Structure
When the game engine needs to decide whom the turret should shoot, it constructs a batch of TargetCandidate values and passes them into the extension function:
// world/sources/assemblies/turret.move
public struct TargetCandidate has copy, drop, store {
item_id: u64, // Target's in-game ID (ship/NPC)
type_id: u64, // Target type
group_id: u64, // Target's group (0=NPC)
character_id: u32, // Pilot's character ID (NPC is 0)
character_tribe: u32, // Pilot tribe (NPC is 0)
hp_ratio: u64, // Remaining health percentage (0-100)
shield_ratio: u64, // Remaining shield percentage (0-100)
armor_ratio: u64, // Remaining armor percentage (0-100)
is_aggressor: bool, // Whether attacking building
priority_weight: u64, // Priority weight (larger is higher priority)
behaviour_change: BehaviourChangeReason, // Reason triggering this update
}
Trigger Reason Enum
public enum BehaviourChangeReason has copy, drop, store {
UNSPECIFIED,
ENTERED, // Ship entered turret range
STARTED_ATTACK, // Ship started attacking
STOPPED_ATTACK, // Ship stopped attacking
}
Important design: on each call, each target candidate carries only its single most relevant reason (the game engine picks the most important one).
This shows that BehaviourChangeReason is a context hint for the recalculation, not a complete combat history. It tells the extension why priorities need recalculating this time, but it doesn't guarantee delivery of every past event. When writing AI, Builders therefore shouldn't assume a single call can see the full threat chain or combat log; if long-term memory is genuinely needed, design your own config or statistics objects for it.
3. Return Format: ReturnTargetPriorityList
Extension function must ultimately return a priority list:
public struct ReturnTargetPriorityList has copy, drop, store {
target_item_id: u64, // Target's in-game ID
priority_weight: u64, // Custom priority score (larger is higher priority)
}
The turret attacks the target with the highest priority_weight in the list (on a tie, the first one listed).
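The selection behavior can be mirrored in a few lines. This TypeScript sketch is illustrative (the engine-side implementation isn't shown in the source); it only demonstrates the highest-weight-wins, first-on-tie rule:

```typescript
interface PriorityEntry {
  targetItemId: number;
  priorityWeight: number;
}

// Highest weight wins; on a tie, the earlier entry is kept (strict > comparison).
function pickTarget(list: PriorityEntry[]): PriorityEntry | undefined {
  let best: PriorityEntry | undefined;
  for (const entry of list) {
    if (best === undefined || entry.priorityWeight > best.priorityWeight) {
      best = entry;
    }
  }
  return best;
}
```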
In other words, the extension returns a suggested order, not an imperative command like "execute this attack now." The difference matters: an imperative interface would let the extension overstep and control the underlying weapon directly, while a priority interface only lets the extension express preferences over a candidate set the kernel has already allowed, keeping the security boundary far more stable.
4. Default Priority Rules (Built-in Logic)
When the Builder hasn't configured an extension, the turret uses the following default rules:
// Default weight increment constants
const STARTED_ATTACK_WEIGHT_INCREMENT: u64 = 10000; // Active attacker +10000
const ENTERED_WEIGHT_INCREMENT: u64 = 1000; // Entered range +1000
// world::turret::get_target_priority_list (default version)
public fun get_target_priority_list(
turret: &Turret,
candidates: vector<TargetCandidate>,
): vector<ReturnTargetPriorityList> {
effective_weight_and_excluded(candidates)
}
fun effective_weight_and_excluded(
candidates: vector<TargetCandidate>,
): vector<ReturnTargetPriorityList> {
let mut result = vector::empty();
candidates.do!(|candidate| {
let weight = match (candidate.behaviour_change) {
BehaviourChangeReason::STARTED_ATTACK => {
candidate.priority_weight + STARTED_ATTACK_WEIGHT_INCREMENT
},
BehaviourChangeReason::ENTERED => {
candidate.priority_weight + ENTERED_WEIGHT_INCREMENT
},
_ => candidate.priority_weight,
};
// 0 means "exclude this target from attack," other values represent priority
if (weight > 0) {
result.push_back(ReturnTargetPriorityList {
target_item_id: candidate.item_id,
priority_weight: weight,
});
}
});
result
}
Default Strategy: Active attackers > Entered range > Others.
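The default rule above can be mirrored as a small off-chain sketch, which is handy for unit-testing your own AI against the baseline. The constants are copied from the Move source; the TypeScript names are illustrative:

```typescript
type Reason = "UNSPECIFIED" | "ENTERED" | "STARTED_ATTACK" | "STOPPED_ATTACK";

interface DefaultCandidate {
  itemId: number;
  priorityWeight: number;
  behaviourChange: Reason;
}

const STARTED_ATTACK_WEIGHT_INCREMENT = 10000;
const ENTERED_WEIGHT_INCREMENT = 1000;

// Mirror of the default rule: add the increment for the trigger reason,
// then drop entries whose final weight is 0 (0 = "exclude from attack").
function defaultPriorities(candidates: DefaultCandidate[]) {
  return candidates
    .map((c) => ({
      targetItemId: c.itemId,
      priorityWeight:
        c.priorityWeight +
        (c.behaviourChange === "STARTED_ATTACK"
          ? STARTED_ATTACK_WEIGHT_INCREMENT
          : c.behaviourChange === "ENTERED"
            ? ENTERED_WEIGHT_INCREMENT
            : 0),
    }))
    .filter((e) => e.priorityWeight > 0);
}
```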
5. Extension Mechanism: TypeName Points to Extension Package
public struct Turret has key {
id: UID,
// ...
extension: Option<TypeName>, // Stores Builder extension package's type name
}
When the game engine needs to determine target priority, it:
- Reads `turret.extension`
- If `None`: calls `world::turret::get_target_priority_list` (the default logic)
- If `Some(TypeName)`: parses the package ID and calls that package's `get_target_priority_list` function
6. Developing Custom Turret AI
Scenario: Attack Only Non-Protected Player Ships (Newbie Protection)
module my_turret::ai;
use world::turret::{Turret, TargetCandidate, ReturnTargetPriorityList};
use sui::dynamic_field as df;
/// Config: newbie protection — ships from the tribes listed here are never attacked
public struct AiConfig has key {
id: UID,
protected_tribe_ids: vector<u32>, // Protected tribes (like newbie tribes)
prefer_aggressors: bool, // Whether to prioritize active attackers
}
/// This is standard entry function name game engine will call (fixed signature)
public fun get_target_priority_list(
turret: &Turret,
candidates: vector<TargetCandidate>,
ai_config: &AiConfig, // Builder's config object
): vector<ReturnTargetPriorityList> {
let mut result = vector::empty<ReturnTargetPriorityList>();
candidates.do!(|candidate| {
// Rule 1: Protected tribe → Skip (weight 0 = exclude)
if (vector::contains(&ai_config.protected_tribe_ids, &candidate.character_tribe)) {
return // Don't add to result list = don't attack
};
// Rule 2: Calculate priority weight
let mut weight: u64 = 1000; // Base weight
// Prioritize active attackers
if (candidate.is_aggressor && ai_config.prefer_aggressors) {
weight = weight + 50000;
};
// Lower HP higher priority (finishing blow strategy)
let hp_score = (100 - candidate.hp_ratio) * 100;
weight = weight + hp_score;
// Additional weight when shield broken
if (candidate.shield_ratio == 0) {
weight = weight + 5000;
};
result.push_back(ReturnTargetPriorityList {
target_item_id: candidate.item_id,
priority_weight: weight,
});
});
result
}
Strategy Comparison: Multiple AI Modes
Default AI:
Active attacker (+10000) > Entered range (+1000)
Finishing Blow AI (lowest HP priority):
is_aggressor bonus + (100-hp_ratio)*100 + shield_broken bonus
Elite Guard AI (protect allies):
Same tribe ships weight=0 + Enemy tribe sorted by hp_ratio
Anti-PvE AI (prioritize NPCs):
character_id==0 (NPC) → Super high weight + Players → Low weight
7. Authorizing Extension to Turret
The Builder first registers the extension's TypeName with the turret:
// Call function provided by world contract, register custom AI type to turret
// (Requires OwnerCap<Turret>)
turret::authorize_extension<my_turret::ai::AiType>(
turret,
owner_cap,
ctx,
);
Afterwards, the game engine calls that extension package's get_target_priority_list whenever it needs a decision.
In production, problems more often come not from the AI math itself but from whether the extension is actually hooked up. The Builder's troubleshooting order should therefore be: check that authorization succeeded, that the turret is online, that the config object is readable, and that the TypeName matches — only then examine the weight algorithm. Otherwise it's easy to misdiagnose an authorization-chain problem as an AI-logic problem.
8. Advanced: Dynamic AI Parameter Configuration
/// Allow turret AI to dynamically update config (no need to redeploy contract)
public fun update_protection_list(
ai_config: &mut AiConfig,
admin: address, // NOTE: in production, store the admin address in AiConfig itself rather than trusting a caller-supplied argument
new_protected_tribes: vector<u32>,
ctx: &TxContext,
) {
assert!(ctx.sender() == admin, 0);
ai_config.protected_tribe_ids = new_protected_tribes;
}
9. State Handling: OnlineReceipt
/// Proof of turret being online
public struct OnlineReceipt {
turret_id: ID,
}
Certain operations require first confirming that the turret is online. OnlineReceipt is a one-time credential that carries the "confirmed online" proof through a chain of function calls, avoiding repeated checks.
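In Move, the hot-potato pattern makes such a receipt single-use at the type level (no `copy`/`drop` abilities). A TypeScript sketch can only imitate this with a runtime flag — the class and function names here are illustrative:

```typescript
// Emulated one-time receipt: consuming it twice throws, mimicking a Move
// hot potato that must be used exactly once.
class OnlineReceipt {
  private consumed = false;
  constructor(public readonly turretId: string) {}

  consume(): string {
    if (this.consumed) throw new Error("receipt already consumed");
    this.consumed = true;
    return this.turretId;
  }
}

function confirmOnline(turretId: string): OnlineReceipt {
  // A real implementation would check on-chain turret state here.
  return new OnlineReceipt(turretId);
}
```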
10. Practice Exercises
- Basic AI: Implement a "focused newbie protection" AI — prioritize ships with `hp_ratio > 80` (nearly full health, clearly veterans), and set weight to 0 for `hp_ratio < 30` (possibly newbies)
- Alliance Guardian AI: Read an alliance member list; assign high priority to non-member ships and weight 0 to member ships
- Leaderboard AI: Record how many of each ship type the turret shoots down, and adjust the strategy weekly (types shot down more often get lower priority — those players have learned to avoid the turret)
Chapter Summary
| Concept | Key Points |
|---|---|
| TargetCandidate | Complete combat information for a target candidate |
| BehaviourChangeReason | ENTERED / STARTED_ATTACK / STOPPED_ATTACK |
| ReturnTargetPriorityList | Return format: item_id + priority_weight (0 = exclude) |
| extension: Option<TypeName> | The turret stores the extension package's type name; the engine calls it dynamically |
| Default weights | STARTED_ATTACK +10000, ENTERED +1000 |
Next Chapter: KillMail System Deep Dive — Understanding EVE Frontier’s complete architecture for on-chain combat death records, from source code structure to interaction with Builder extensions.
Chapter 32: KillMail System Deep Dive
Learning Objective: Understand EVE Frontier’s complete architecture for on-chain combat death records — from source code structure to interaction methods with Builder extensions.
Status: Teaching example. Code in the text is simplified for explanation; for source verification, refer to the actual world-contracts files in the repository.
Minimal Call Chain
Game server -> AdminACL validation -> create_killmail -> derived_object::claim -> share_object -> emit event
Corresponding Code Directory
Key Structs
| Type | Purpose | Reading Focus |
|---|---|---|
| Killmail | On-chain kill record, a shared object | How the unique key, timestamp, kill parties, and location are persisted |
| LossType | Distinguishes ship loss from structure loss | How it affects upper-layer business interpretation |
| KillmailRegistry | Registry and index entry | How it avoids duplicate creation, and how records are located |
| TenantItemId | Mapping key from in-game object to on-chain object | How tenant + item_id forms a stable business key |
Key Entry Functions
| Entry | Purpose | What to Confirm |
|---|---|---|
| create_killmail | Create a kill record | Whether sponsor validation, uniqueness validation, and anti-replay run first |
| derived_object::claim related path | Generate a deterministic object ID | Whether the business key is stable and whether it could be claimed repeatedly |
| Registry read/write entry | Establish lookup relationships | Whether the Registry is only an index, not the record body itself |
Most Easily Misunderstood Points
- `Killmail` isn't a pure event log, but a queryable, indexable shared object
- The `Registry` isn't for "storing another copy of the data," but for stable retrieval and uniqueness constraints
- Uniqueness comes from the business key + the `derived_object` path, not from randomly generating a new UID
When reading this chapter, it helps to hold two perspectives at once: the object perspective and the index perspective. The object perspective asks what state is actually settled on-chain and whether subsequent contracts can read it directly; the index perspective asks how off-chain services can stably discover it, aggregate it, and locate it by business key. KillMail is heavier than an ordinary event because EVE treats it as reusable, long-term world state rather than a one-off broadcast. Many Builders' first reaction is "we already emit an event, why also share_object a copy?" — the answer is exactly this: events suit broadcasting and statistics, while objects suit subsequent contract composition, permission validation, and deterministic addressing.
2.1 What is KillMail?
In EVE Frontier, each player-vs-player (PvP) kill generates an immutable record on-chain, called a KillMail. This isn't just a log — it's a shared object with a unique object ID that anyone can query on-chain.
On-chain Structure Relationships:
KillmailRegistry (registry)
└── Killmail (shared object)
├── killer_id : Killer TenantItemId
├── victim_id : Victim TenantItemId
├── kill_timestamp (Unix seconds)
├── loss_type : SHIP | STRUCTURE
└── solar_system_id : Location solar system
2.2 KillMail Core Data Structure
Source Code Deep Dive (world/sources/killmail/killmail.move)
// === Enums ===
/// Kill type: Ship or Structure
public enum LossType has copy, drop, store {
SHIP,
STRUCTURE,
}
/// On-chain KillMail shared object
public struct Killmail has key {
id: UID,
key: TenantItemId, // Deterministic ID from item_id + tenant
killer_id: TenantItemId,
victim_id: TenantItemId,
reported_by_character_id: TenantItemId,
kill_timestamp: u64, // Unix timestamp (seconds, not milliseconds!)
loss_type: LossType,
solar_system_id: TenantItemId,
}
Key design: a `Killmail`'s `id` isn't randomly generated; it is deterministically derived via `derived_object::claim(registry, key)` from the `KillmailRegistry`, guaranteeing the uniqueness of the `item_id → object_id` mapping.
What is TenantItemId?
// world/sources/primitives/in_game_id.move
public struct TenantItemId has copy, drop, store {
item_id: u64, // Game internal business ID
tenant: String, // Game tenant identifier (like "evefrontier")
}
// Creation method
let key = in_game_id::create_key(item_id, tenant);
This design allows the same item_id to be reused across different tenants (different servers or game versions) without conflict.
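A minimal sketch of the composite-key idea. The string encoding below is purely illustrative — on-chain the key is a struct, not a string — but it shows why the tenant prevents cross-server collisions:

```typescript
// Illustrative composite business key: tenant + item_id together are unique,
// so the same item_id on two tenants never maps to the same key.
function createKey(itemId: bigint, tenant: string): string {
  return `${tenant}:${itemId.toString()}`;
}
```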
2.3 KillMail Creation Flow
Full Flow Analysis
public fun create_killmail(
registry: &mut KillmailRegistry,
admin_acl: &AdminACL, // Only authorized servers can create
item_id: u64, // Kill record's in-game ID
killer_id: u64,
victim_id: u64,
reported_by_character: &Character, // Reporting character (must be present)
kill_timestamp: u64, // Unix seconds
loss_type: u8, // 1=SHIP, 2=STRUCTURE
solar_system_id: u64,
ctx: &mut TxContext,
) {
// 1. Verify caller is authorized server
admin_acl.verify_sponsor(ctx);
// 2. Generate key using reporter's tenant
let tenant = reported_by_character.tenant();
let killmail_key = in_game_id::create_key(item_id, tenant);
// 3. Prevent duplicate creation
assert!(!registry.object_exists(killmail_key), EKillmailAlreadyExists);
// 4. Verify key fields non-zero
assert!(item_id != 0, EKillmailIdEmpty);
assert!(killer_id != 0, ECharacterIdEmpty);
// ...
// 5. Derive deterministic UID from registry (core mechanism)
let killmail_uid = derived_object::claim(registry.borrow_registry_id(), killmail_key);
// 6. Create and share
let killmail = Killmail { id: killmail_uid, ... };
transfer::share_object(killmail);
}
Flow Diagram
Game Server → create_killmail()
↓
verify_sponsor (AdminACL check)
↓
create_key(item_id, tenant)
↓
object_exists? → Yes → ABORT EKillmailAlreadyExists
↓ No
derived_object::claim → Deterministic UID
↓
Killmail {..} → share_object
↓
emit KillmailCreatedEvent
2.4 Event System and Off-Chain Indexing
public struct KillmailCreatedEvent has copy, drop {
key: TenantItemId,
killer_id: TenantItemId,
victim_id: TenantItemId,
reported_by_character_id: TenantItemId,
loss_type: LossType,
kill_timestamp: u64,
solar_system_id: TenantItemId,
}
KillMail uses event indexing + object storage dual-track system:
| Component | Use |
|---|---|
| On-chain shared object Killmail | Readable by contracts; Builder extensions can query it |
| KillmailCreatedEvent | Lets index services monitor in real time and build leaderboards/statistics |
In this dual-track design, events aren't the source of truth for state; they're a discovery mechanism. Indexers typically learn "a new KillMail appeared" from the event first, then read the object body on-chain by object ID or business key. The benefit is that off-chain leaderboards, achievement systems, and battle reports can consume events at high throughput, while anything involving reward distribution, dispute arbitration, or subsequent extension reads/writes can still fall back to the object layer for stable state. If you relied on events alone, later Builder contracts would have no unified on-chain read entry.
2.5 How Do Builders Use KillMail?
Scenario: Kill Score Reward System
A Builder can listen for KillmailCreatedEvent events and accept reward claims in their own extension contract:
module my_pvp::kill_reward;
use world::killmail::Killmail;
use world::access::OwnerCap;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
public struct RewardPool has key {
id: UID,
balance: Balance<SUI>,
reward_per_kill: u64,
owner: address,
}
/// Player submits KillMail object to claim SUI reward
public fun claim_kill_reward(
pool: &mut RewardPool,
killmail: &Killmail, // Pass in on-chain KillMail object
character_id: ID, // Caller's character ID
ctx: &mut TxContext,
) {
// Verify killmail.killer_id corresponds to current caller's character
// (Actually needs OwnerCap verification)
assert!(balance::value(&pool.balance) >= pool.reward_per_kill, 0);
let reward = coin::take(&mut pool.balance, pool.reward_per_kill, ctx);
transfer::public_transfer(reward, ctx.sender());
}
Scenario: KillMail-Based NFT Badge
/// Mint "Centurion Badge" NFT after 100 kills
public fun mint_centurion_badge(
tracker: &KillTracker, // Self-built kill count tracking object
recipient: address,
ctx: &mut TxContext,
) {
assert!(tracker.kill_count >= 100, ENotEnoughKills);
// Mint NFT...
}
2.6 derived_object Pattern Deep Dive
KillMail uses Sui’s derived_object (deterministic object ID) pattern, an important design in EVE Frontier World contracts:
// Derive deterministic UID from registry
let killmail_uid = derived_object::claim(registry.borrow_registry_id(), killmail_key);
Why not use object::new(ctx)?
| Comparison | object::new(ctx) | derived_object::claim() |
|---|---|---|
| ID source | Random (based on tx digest) | Deterministic (based on key) |
| Duplicate creation | Cannot prevent (new ID each time) | Auto-prevents (key can only be used once) |
| Off-chain precomputation | Impossible | Possible (knowing key means knowing ID) |
| Use case | Ordinary objects | Game assets, KillMail and other objects with business IDs |
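The determinism property is easy to demonstrate off-chain. The sketch below uses SHA-256 as a stand-in; Sui's actual derivation scheme is different, so this cannot be used to precompute real object IDs — it only illustrates "the ID is a pure function of (parent, key), and a key can be claimed once":

```typescript
import { createHash } from "node:crypto";

// Stand-in derivation: deterministic ID from (parentId, key).
function deriveId(parentId: string, key: string): string {
  return createHash("sha256").update(parentId).update(key).digest("hex");
}

// Claiming the same key twice under the same parent is rejected,
// which is what gives "anti-replay for free" on-chain.
const claimed = new Set<string>();

function claim(parentId: string, key: string): string {
  const id = deriveId(parentId, key);
  if (claimed.has(id)) throw new Error("key already claimed");
  claimed.add(id);
  return id;
}
```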
2.7 KillMail Registry Design
// world/sources/registry/killmail_registry.move
public struct KillmailRegistry has key {
id: UID,
// Note: No other fields! All data stored via derived_object
}
public fun object_exists(registry: &KillmailRegistry, key: TenantItemId): bool {
derived_object::exists(&registry.id, key)
}
This registry is extremely minimal — it’s just a UID container, all KillMails exist as its derived children in Sui’s state tree.
The key design philosophy: the Registry doesn't store business details; it only provides a namespace and a uniqueness anchor. This is lighter than having the Registry contain a Table<key, object_id>, because real uniqueness is already guaranteed by derived_object. Think of it as a "parent directory" rather than a "database table." Once you understand this approach, the deterministic-object patterns you'll see later for characters, structures, permits, and credentials will be much easier to follow.
2.8 Security Analysis
Only Server Can Create
admin_acl.verify_sponsor(ctx);
verify_sponsor checks whether the caller is in the AdminACL.authorized_sponsors list. Ordinary players cannot forge a KillMail — every kill record is submitted by an address linked to the game server's key.
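Logically the check reduces to an allowlist lookup. A hedged TypeScript sketch (the function and set names are illustrative, not the contract's actual fields):

```typescript
// Illustrative sponsor verification: reject any sender not on the allowlist.
function verifySponsor(authorizedSponsors: ReadonlySet<string>, sender: string): void {
  if (!authorizedSponsors.has(sender)) {
    throw new Error("unauthorized sponsor");
  }
}
```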
Anti-Replay
assert!(!registry.object_exists(killmail_key), EKillmailAlreadyExists);
Using derived_object existence check naturally prevents same battle from being submitted repeatedly.
2.9 Practice Exercises
- Read a KillMail: Write a PTB (programmable transaction block) that takes a KillMail object ID and prints `killer_id`, `victim_id`, and `kill_timestamp`
- Kill Score Contract: Implement a score system based on KillMail — 100 points per ship kill, 50 points per structure kill
- KillMail NFT Credential: Design a Builder extension that lets the victim claim "death compensation" based on a KillMail object ID
Chapter Summary
| Concept | Key Points |
|---|---|
| Killmail | Immutable shared object recording PvP kill events |
| TenantItemId | item_id + tenant composite key; supports multi-tenancy |
| derived_object | Deterministic object IDs; prevents duplication, supports off-chain precomputation |
| KillmailRegistry | A bare UID serving as the parent node for derived children |
| Security mechanisms | AdminACL verification + derived_object anti-replay |
Next Chapter: zkLogin Principles — Understanding how EVE Vault uses zero-knowledge proofs to enable passwordless wallet access through OAuth login.
Example 8: Builder Competition System (On-Chain Leaderboard + Automatic Rewards)
Goal: Build an on-chain competition framework: within a fixed time window, players compete for points, the leaderboard is recorded on-chain, settlement runs automatically at the deadline, and the top three receive NFT trophies and token rewards.
Status: Code skeleton. The repository includes Move.toml, weekly_race.move, and a dApp directory, but point-reporting authorization, reward asset types, and off-chain settlement sources still need to be completed for your competition's business.
Code Directory
Minimal Call Chain
Create competition -> Fund prize pool -> Off-chain aggregates points -> Server authorizes point reporting -> Deadline settlement -> Distribute prizes and trophies
Off-Chain Responsibility Boundaries
The most error-prone part of this example isn't the leaderboard itself but the off-chain collaboration boundary. Separate responsibilities clearly:
- On-chain only handles: Competition lifecycle, prize pool funds, final settlement, trophy minting
- Server handles: Monitor jump events, aggregate scores by season, sign point reports
- Frontend handles: Display current points, trigger admin operations, read settlement results
If you can't yet complete server signatures and point aggregation, don't present this example as a "complete automated competition system"; the more accurate description is "competition contract skeleton + leaderboard settlement model."
Requirements Analysis
Scenario: You (the Builder) host a weekly "Mining Area Contest" — whoever jumps through your gate the most times this week wins:
- 📅 Format: Starts every Sunday 00:00 UTC, ends next Saturday 23:59
- 📊 Points: +1 point per jump (reported by monitoring GateJumped events)
- 🏆 Rewards:
- 🥇 First Place: Champion NFT Trophy + 500 ALLY Token
- 🥈 Second Place: Elite NFT Trophy + 200 ALLY Token
- 🥉 Third Place: Contender NFT Trophy + 100 ALLY Token
- 💡 Key: Top three automatically determined by contract based on on-chain points, no manual intervention
Part 1: Competition Contract
module competition::weekly_race;
use sui::table::{Self, Table};
use sui::object::{Self, UID, ID};
use sui::clock::Clock;
use sui::coin::{Self, Coin};
use sui::sui::SUI;
use sui::balance::{Self, Balance};
use sui::event;
use sui::transfer;
use std::string::{Self, String, utf8};
// Note: this example omits the actual project's `AdminACL` / `verify_sponsor`
// imports and the off-chain leaderboard aggregation logic; it only shows the contract modeling approach.
// ── Constants ──────────────────────────────────────────────────
const WEEK_DURATION_MS: u64 = 7 * 24 * 60 * 60 * 1000; // 7 days
// ── Data Structures ───────────────────────────────────────────────
/// Competition (create new one each week)
public struct Race has key {
id: UID,
season: u64, // Season number
start_time_ms: u64,
end_time_ms: u64,
scores: Table<address, u64>, // Player address → points
top3: vector<address>, // Top three (filled after settlement)
is_settled: bool,
prize_pool_sui: Balance<SUI>,
admin: address,
}
/// Trophy NFT
public struct TrophyNFT has key, store {
id: UID,
season: u64,
rank: u8, // 1, 2, 3
score: u64,
winner: address,
image_url: String,
}
public struct RaceAdminCap has key, store { id: UID }
// ── Events ──────────────────────────────────────────────────
public struct ScoreUpdated has copy, drop {
race_id: ID,
player: address,
new_score: u64,
}
public struct RaceSettled has copy, drop {
race_id: ID,
season: u64,
winner: address,
second: address,
third: address,
}
// ── Initialization ────────────────────────────────────────────────
fun init(ctx: &mut TxContext) {
transfer::transfer(RaceAdminCap { id: object::new(ctx) }, ctx.sender());
}
/// Create new competition
public fun create_race(
_cap: &RaceAdminCap,
season: u64,
clock: &Clock,
ctx: &mut TxContext,
) {
let start = clock.timestamp_ms();
let race = Race {
id: object::new(ctx),
season,
start_time_ms: start,
end_time_ms: start + WEEK_DURATION_MS,
scores: table::new(ctx),
top3: vector::empty(),
is_settled: false,
prize_pool_sui: balance::zero(),
admin: ctx.sender(),
};
transfer::share_object(race);
}
/// Fund prize pool
public fun fund_prize_pool(
race: &mut Race,
_cap: &RaceAdminCap,
coin: Coin<SUI>,
) {
balance::join(&mut race.prize_pool_sui, coin::into_balance(coin));
}
// ── Score Reporting (called by competition server or turret/gate extension) ────────────
public fun report_score(
race: &mut Race,
player: address,
score_delta: u64, // Points added this time
clock: &Clock,
admin_acl: &AdminACL, // Requires game server signature
ctx: &TxContext,
) {
verify_sponsor(admin_acl, ctx); // Verify authorized server
assert!(!race.is_settled, ERaceEnded);
assert!(clock.timestamp_ms() <= race.end_time_ms, ERaceEnded);
if (!table::contains(&race.scores, player)) {
table::add(&mut race.scores, player, 0u64);
};
let score = table::borrow_mut(&mut race.scores, player);
*score = *score + score_delta;
event::emit(ScoreUpdated {
race_id: object::id(race),
player,
new_score: *score,
});
}
// ── Settlement (top three computed off-chain, then passed in) ────────────────
public fun settle_race(
race: &mut Race,
_cap: &RaceAdminCap,
first: address,
second: address,
third: address,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(!race.is_settled, EAlreadySettled);
assert!(clock.timestamp_ms() >= race.end_time_ms, ERaceNotEnded);
// Verify on-chain scores (prevent fake rankings)
let s1 = *table::borrow(&race.scores, first);
let s2 = *table::borrow(&race.scores, second);
let s3 = *table::borrow(&race.scores, third);
assert!(s1 >= s2 && s2 >= s3, EInvalidRanking);
race.is_settled = true;
race.top3 = vector[first, second, third];
// Distribute prize pool: 50% to first, 30% to second, 20% to third
let total = balance::value(&race.prize_pool_sui);
let prize1 = coin::take(&mut race.prize_pool_sui, total * 50 / 100, ctx);
let prize2 = coin::take(&mut race.prize_pool_sui, total * 30 / 100, ctx);
let prize3 = coin::take(&mut race.prize_pool_sui, balance::value(&race.prize_pool_sui), ctx);
transfer::public_transfer(prize1, first);
transfer::public_transfer(prize2, second);
transfer::public_transfer(prize3, third);
// Mint trophy NFTs
mint_trophy(race.season, 1, s1, first, ctx);
mint_trophy(race.season, 2, s2, second, ctx);
mint_trophy(race.season, 3, s3, third, ctx);
event::emit(RaceSettled {
race_id: object::id(race),
season: race.season,
winner: first,
second,
third,
});
}
fun mint_trophy(
season: u64,
rank: u8,
score: u64,
winner: address,
ctx: &mut TxContext,
) {
let (_name, image_url) = match (rank) {
1 => (b"Champion Trophy", b"https://assets.example.com/trophies/gold.png"),
2 => (b"Elite Trophy", b"https://assets.example.com/trophies/silver.png"),
_ => (b"Contender Trophy", b"https://assets.example.com/trophies/bronze.png"),
};
let trophy = TrophyNFT {
id: object::new(ctx),
season,
rank,
score,
winner,
image_url: utf8(image_url),
};
transfer::public_transfer(trophy, winner);
}
const ERaceEnded: u64 = 0;
const EAlreadySettled: u64 = 1;
const ERaceNotEnded: u64 = 2;
const EInvalidRanking: u64 = 3;
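One arithmetic detail in `settle_race` is worth calling out: third place receives whatever remains in the pool instead of `total * 20 / 100`, so integer division can never strand dust in the pool. The same arithmetic as an off-chain sketch:

```typescript
// Split a prize pool 50/30/20 using integer (bigint) math.
// Third place takes the remainder so first + second + third === total.
function splitPrizes(total: bigint): [bigint, bigint, bigint] {
  const first = (total * 50n) / 100n;
  const second = (total * 30n) / 100n;
  const third = total - first - second;
  return [first, second, third];
}
```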
Part 2: Settlement Script (Off-Chain Ranking + On-Chain Settlement)
// scripts/settle-race.ts
import { SuiClient } from "@mysten/sui/client"
import { Transaction } from "@mysten/sui/transactions"
import { Ed25519Keypair } from "@mysten/sui/keypairs/ed25519"
const RACE_PKG = "0x_COMPETITION_PACKAGE_"
const RACE_ID = "0x_RACE_ID_"
async function settleRace() {
const client = new SuiClient({ url: "https://fullnode.testnet.sui.io:443" })
const adminKeypair = Ed25519Keypair.fromSecretKey(/* ... */)
// 1. Read all scores from on-chain (aggregate via ScoreUpdated events)
const scoreMap = new Map<string, number>()
let cursor = null
do {
const page = await client.queryEvents({
query: { MoveEventType: `${RACE_PKG}::weekly_race::ScoreUpdated` },
cursor,
limit: 200,
})
for (const event of page.data) {
const { player, new_score } = event.parsedJson as any
scoreMap.set(player, Number(new_score)) // Take latest value
}
cursor = page.nextCursor
} while (cursor)
// 2. Sort to find top three
const sorted = [...scoreMap.entries()]
.sort((a, b) => b[1] - a[1])
if (sorted.length < 3) {
console.log("Insufficient participants, cannot settle")
return
}
const [first, second, third] = sorted.slice(0, 3).map(([addr]) => addr)
console.log(`First: ${first} (${sorted[0][1]} points)`)
console.log(`Second: ${second} (${sorted[1][1]} points)`)
console.log(`Third: ${third} (${sorted[2][1]} points)`)
// 3. Submit settlement transaction
const tx = new Transaction()
tx.moveCall({
target: `${RACE_PKG}::weekly_race::settle_race`,
arguments: [
tx.object(RACE_ID),
tx.object("ADMIN_CAP_ID"),
tx.pure.address(first),
tx.pure.address(second),
tx.pure.address(third),
tx.object("0x6"), // Clock
],
})
const result = await client.signAndExecuteTransaction({
signer: adminKeypair,
transaction: tx,
})
console.log("Settlement successful! Trophies distributed. Tx:", result.digest)
}
settleRace()
Part 3: Real-Time Leaderboard dApp
// src/LeaderboardApp.tsx
import { useEffect, useState } from 'react'
import { useRealtimeEvents } from './hooks/useRealtimeEvents'
const RACE_PKG = "0x_COMPETITION_PACKAGE_"
interface ScoreEntry {
rank: number
address: string
score: number
}
export function LeaderboardApp() {
const [scores, setScores] = useState<Map<string, number>>(new Map())
const [timeLeft, setTimeLeft] = useState('')
const raceEnd = new Date('2026-03-08T00:00:00Z').getTime()
// Real-time subscribe to score updates
const events = useRealtimeEvents<{ player: string; new_score: string }>(
`${RACE_PKG}::weekly_race::ScoreUpdated`
)
useEffect(() => {
setScores(prev => {
const updated = new Map(prev)
for (const e of events) {
updated.set(e.player, Number(e.new_score))
}
return updated
})
}, [events])
// Countdown
useEffect(() => {
const timer = setInterval(() => {
const diff = raceEnd - Date.now()
if (diff <= 0) { setTimeLeft('Ended'); return }
const d = Math.floor(diff / 86400000)
const h = Math.floor((diff % 86400000) / 3600000)
const m = Math.floor((diff % 3600000) / 60000)
setTimeLeft(`${d}d ${h}h ${m}m`)
}, 1000)
return () => clearInterval(timer)
}, [])
const sorted: ScoreEntry[] = [...scores.entries()]
.sort((a, b) => b[1] - a[1])
.slice(0, 10)
.map(([address, score], i) => ({ rank: i + 1, address, score }))
const medals = ['🥇', '🥈', '🥉']
return (
<div className="leaderboard">
<header>
<h1>🏆 First Gate Jump Competition</h1>
<div className="countdown">
⏳ Time Remaining: <strong>{timeLeft}</strong>
</div>
</header>
<table className="ranking-table">
<thead>
<tr><th>Rank</th><th>Player</th><th>Jump Count</th></tr>
</thead>
<tbody>
{sorted.map(({ rank, address, score }) => (
<tr key={address} className={rank <= 3 ? 'top3' : ''}>
<td>{medals[rank - 1] ?? rank}</td>
<td>{address.slice(0, 6)}...{address.slice(-4)}</td>
<td><strong>{score}</strong> times</td>
</tr>
))}
{sorted.length === 0 && (
<tr><td colSpan={3}>No data yet, waiting for first jump...</td></tr>
)}
</tbody>
</table>
</div>
)
}
🎯 Complete Review
Contract Layer
├── weekly_race.move
│ ├── Race (shared object, one per season)
│ ├── TrophyNFT (trophy object)
│ ├── create_race() ← Admin creates
│ ├── fund_prize_pool() ← Admin funds prize pool
│ ├── report_score() ← Server reports points (AdminACL verification)
│ └── settle_race() ← Admin passes in top three, contract verifies and settles
Settlement Script
└── settle-race.ts
├── QueryEvents aggregates all points
├── Sort to calculate top three
└── Submit settle_race() transaction
dApp Layer
└── LeaderboardApp.tsx
├── subscribeEvent real-time updates leaderboard
└── Competition countdown
🔧 Extension Exercises
- Anti-Score Farming: Rate-limit `report_score` (each player can earn at most 60 points per minute)
- Public Verification: Store a hash of the raw data behind each score report on-chain, so anyone can verify the final ranking
- Season System: Admin cannot end current competition early, contract enforces timeline
📚 Related Documentation
- Chapter 8: Sponsored Transactions & Server Integration
- Chapter 9: Off-Chain Indexing
- Chapter 13: NFT Design
Practical Case 10: Space Resource Warfare (Comprehensive Practice)
Objective: Integrate all knowledge from this course to build a miniature complete game: two alliances competing for control of a mining area, including turret offense/defense, stargate tolls, item storage, token rewards, and real-time battle report dApp.
Status: Comprehensive case. The main text integrates multiple modules and is the best case to verify whether you’ve truly connected the first half of the book.
Corresponding Code Directory
Minimal Call Chain
Issue faction NFT -> Stargate/Turret faction verification -> Player mining rewards -> WAR Token distribution -> dApp displays battle status
Project Overview
┌─────────────────────────────────────────────┐
│ Space Resource Warfare │
│ │
│ Alliance A Alliance B │
│ Territory (Turret ×2) Territory (Turret ×2)│
│ ↑ ↑ │
│ ┌─[Gate A1]─── Neutral Mining Area ───[Gate B1]─┐ │
│ │ (Storage Box + Resources) │ │
│ └─────────────────────────────────────┘ │
│ │
│ Battle Rules: │
│ • Entering neutral mining area requires passing opponent's turret check │
│ • Must hold "Faction NFT" to pass own stargate │
│ • Mining area resources refresh hourly, first come first served │
│ • Each mining operation earns WAR Token (alliance token) │
└─────────────────────────────────────────────┘
Contract Architecture Design
war_game/
├── Move.toml
└── sources/
├── faction_nft.move # Faction NFT (alliance membership credential)
├── war_token.move # WAR Token (war token)
├── faction_gate.move # Stargate extension (faction check)
├── faction_turret.move # Turret extension (enemy detection)
├── mining_depot.move # Mining area storage box extension (resource collection)
└── war_registry.move # Game registry (global state)
Part One: Core Contracts
faction_nft.move
module war_game::faction_nft;
use sui::object::{Self, UID};
use sui::transfer;
use sui::clock::Clock;
use sui::tx_context::TxContext;
use std::string::{String, utf8};
public struct FACTION_NFT has drop {}
/// Faction enumeration
const FACTION_ALPHA: u8 = 0;
const FACTION_BETA: u8 = 1;
/// Faction NFT (alliance membership proof)
public struct FactionNFT has key, store {
id: UID,
faction: u8, // 0 = Alpha, 1 = Beta
member_since_ms: u64,
name: String,
}
public struct WarAdminCap has key, store { id: UID }
public fun enlist(
_admin: &WarAdminCap,
faction: u8,
member_name: vector<u8>,
recipient: address,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(faction == FACTION_ALPHA || faction == FACTION_BETA, EInvalidFaction);
let nft = FactionNFT {
id: object::new(ctx),
faction,
member_since_ms: clock.timestamp_ms(),
name: utf8(member_name),
};
transfer::public_transfer(nft, recipient);
}
public fun get_faction(nft: &FactionNFT): u8 { nft.faction }
public fun is_alpha(nft: &FactionNFT): bool { nft.faction == FACTION_ALPHA }
public fun is_beta(nft: &FactionNFT): bool { nft.faction == FACTION_BETA }
const EInvalidFaction: u64 = 0;
war_token.move
module war_game::war_token;
/// WAR Token (standard Coin design, see Chapter 14)
public struct WAR_TOKEN has drop {}
fun init(witness: WAR_TOKEN, ctx: &mut TxContext) {
let (treasury, metadata) = sui::coin::create_currency(
witness, 6, b"WAR", b"War Token",
b"Earned through combat and mining in the Space Resource War",
option::none(), ctx,
);
transfer::public_transfer(treasury, ctx.sender());
transfer::public_freeze_object(metadata);
}
faction_gate.move (Stargate Extension)
module war_game::faction_gate;
use war_game::faction_nft::{Self, FactionNFT};
use world::gate::{Self, Gate};
use world::character::Character;
use sui::clock::Clock;
use sui::tx_context::TxContext;
public struct AlphaGateAuth has drop {}
public struct BetaGateAuth has drop {}
/// Alpha alliance stargate: only allows Alpha members to pass
public fun alpha_gate_jump(
source_gate: &Gate,
dest_gate: &Gate,
character: &Character,
faction_nft: &FactionNFT,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(faction_nft::is_alpha(faction_nft), EWrongFaction);
gate::issue_jump_permit(
source_gate, dest_gate, character, AlphaGateAuth {},
clock.timestamp_ms() + 30 * 60 * 1000, ctx,
);
}
/// Beta alliance stargate
public fun beta_gate_jump(
source_gate: &Gate,
dest_gate: &Gate,
character: &Character,
faction_nft: &FactionNFT,
clock: &Clock,
ctx: &mut TxContext,
) {
assert!(faction_nft::is_beta(faction_nft), EWrongFaction);
gate::issue_jump_permit(
source_gate, dest_gate, character, BetaGateAuth {},
clock.timestamp_ms() + 30 * 60 * 1000, ctx,
);
}
const EWrongFaction: u64 = 0;
mining_depot.move (Mining Area Core)
module war_game::mining_depot;
use war_game::faction_nft::{Self, FactionNFT};
use war_game::war_token::WAR_TOKEN;
use world::storage_unit::{Self, StorageUnit};
use world::character::Character;
use sui::coin::{Self, TreasuryCap};
use sui::clock::Clock;
use sui::object::{Self, UID};
use sui::tx_context::TxContext;
use sui::event;
public struct MiningAuth has drop {}
/// Mining area state
public struct MiningDepot has key {
id: UID,
resource_count: u64, // Current available quantity
last_refresh_ms: u64, // Last refresh time
refresh_amount: u64, // Amount replenished per refresh
refresh_interval_ms: u64, // Refresh interval
alpha_total_mined: u64,
beta_total_mined: u64,
}
public struct ResourceMined has copy, drop {
miner: address,
faction: u8,
amount: u64,
faction_total: u64,
}
/// Mining (checks faction NFT and distributes WAR Token reward)
public fun mine(
depot: &mut MiningDepot,
storage_unit: &mut StorageUnit,
character: &Character,
faction_nft: &FactionNFT, // Requires faction authentication
war_treasury: &mut TreasuryCap<WAR_TOKEN>,
amount: u64,
clock: &Clock,
ctx: &mut TxContext,
) {
// Auto refresh resources
maybe_refresh(depot, clock);
assert!(amount > 0 && amount <= depot.resource_count, EInsufficientResource);
depot.resource_count = depot.resource_count - amount;
// Update statistics based on faction
let faction = faction_nft::get_faction(faction_nft);
if (faction == 0) {
depot.alpha_total_mined = depot.alpha_total_mined + amount;
} else {
depot.beta_total_mined = depot.beta_total_mined + amount;
};
// Withdraw resources (from SSU)
// storage_unit::withdraw_batch(storage_unit, character, MiningAuth {}, RESOURCE_TYPE_ID, amount, ctx)
// Distribute WAR Token reward (10 WAR per resource unit)
let war_reward = amount * 10_000_000; // 10 WAR per unit, 6 decimals
let war_coin = sui::coin::mint(war_treasury, war_reward, ctx);
sui::transfer::public_transfer(war_coin, ctx.sender());
event::emit(ResourceMined {
miner: ctx.sender(),
faction,
amount,
faction_total: if (faction == 0) { depot.alpha_total_mined } else { depot.beta_total_mined },
});
}
fun maybe_refresh(depot: &mut MiningDepot, clock: &Clock) {
let now = clock.timestamp_ms();
if now >= depot.last_refresh_ms + depot.refresh_interval_ms {
depot.resource_count = depot.resource_count + depot.refresh_amount;
depot.last_refresh_ms = now;
}
}
const EInsufficientResource: u64 = 0;
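For dApps that want to preview depot state between transactions, the contract's refresh and reward arithmetic can be mirrored off-chain. A TypeScript sketch of the same rules (field names mirror the Move struct; this is a hand-written illustration, not generated bindings):

```typescript
// Off-chain mirror of MiningDepot's refresh and reward math.
interface DepotView {
  resourceCount: bigint
  lastRefreshMs: bigint
  refreshAmount: bigint
  refreshIntervalMs: bigint
}

// Same rule as maybe_refresh(): top up once the interval has elapsed.
function maybeRefresh(depot: DepotView, nowMs: bigint): DepotView {
  if (nowMs >= depot.lastRefreshMs + depot.refreshIntervalMs) {
    return {
      ...depot,
      resourceCount: depot.resourceCount + depot.refreshAmount,
      lastRefreshMs: nowMs,
    }
  }
  return depot
}

// 10 WAR per resource unit, expressed with 6 decimals (matches mine()).
function warReward(amount: bigint): bigint {
  return amount * 10_000_000n
}
```

Using bigint keeps the arithmetic exact, matching the u64 semantics on-chain.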
Part Two: Real-time Battle Report dApp
// src/WarDashboard.tsx
import { useState, useEffect } from 'react'
import { useRealtimeEvents } from './hooks/useRealtimeEvents'
import { useCurrentClient } from '@mysten/dapp-kit-react'
import { useConnection } from '@evefrontier/dapp-kit'
const WAR_PKG = "0x_WAR_PACKAGE_"
const DEPOT_ID = "0x_DEPOT_ID_"
interface DepotState {
resource_count: string
alpha_total_mined: string
beta_total_mined: string
last_refresh_ms: string
}
interface MiningEvent {
miner: string
faction: string
amount: string
faction_total: string
}
const FACTION_COLOR = { '0': '#3B82F6', '1': '#EF4444' } // Alpha=blue, Beta=red
const FACTION_NAME = { '0': 'Alpha Alliance', '1': 'Beta Alliance' }
export function WarDashboard() {
const { isConnected, currentAddress } = useConnection()
const client = useCurrentClient()
const [depot, setDepot] = useState<DepotState | null>(null)
const [nextRefreshIn, setNextRefreshIn] = useState(0)
// Load mining area state
const loadDepot = async () => {
const obj = await client.getObject({ id: DEPOT_ID, options: { showContent: true } })
if (obj.data?.content?.dataType === 'moveObject') {
setDepot(obj.data.content.fields as DepotState)
}
}
useEffect(() => { loadDepot() }, [])
// Refresh countdown
useEffect(() => {
if (!depot) return
const timer = setInterval(() => {
const refreshInterval = 60 * 60 * 1000 // 1 hour
const nextRefresh = Number(depot.last_refresh_ms) + refreshInterval
setNextRefreshIn(Math.max(0, nextRefresh - Date.now()))
}, 1000)
return () => clearInterval(timer)
}, [depot])
// Real-time battle report
const miningEvents = useRealtimeEvents<MiningEvent>(
`${WAR_PKG}::mining_depot::ResourceMined`,
{ maxEvents: 20 }
)
useEffect(() => {
if (miningEvents.length > 0) loadDepot() // Refresh mining area state when mining events occur
}, [miningEvents])
// Calculate territory control percentage
const alpha = Number(depot?.alpha_total_mined ?? 0)
const beta = Number(depot?.beta_total_mined ?? 0)
const total = alpha + beta
const alphaPct = total > 0 ? Math.round(alpha * 100 / total) : 50
return (
<div className="war-dashboard">
<h1>Space Resource Warfare</h1>
{/* Faction control rate */}
<section className="control-bar-section">
<div className="control-labels">
<span style={{ color: FACTION_COLOR['0'] }}>
Alpha {alphaPct}%
</span>
<span style={{ color: FACTION_COLOR['1'] }}>
{100 - alphaPct}% Beta
</span>
</div>
<div className="control-bar">
<div
className="alpha-bar"
style={{ width: `${alphaPct}%`, background: FACTION_COLOR['0'] }}
/>
</div>
</section>
{/* Mining area status */}
<section className="depot-status">
<div className="stat-card">
<span>Remaining Resources</span>
<strong>{depot?.resource_count ?? '-'}</strong>
</div>
<div className="stat-card">
<span>Next Refresh</span>
<strong>{Math.ceil(nextRefreshIn / 60000)} minutes</strong>
</div>
<div className="stat-card alpha">
<span style={{ color: FACTION_COLOR['0'] }}>Alpha Total Mined</span>
<strong>{depot?.alpha_total_mined ?? '-'}</strong>
</div>
<div className="stat-card beta">
<span style={{ color: FACTION_COLOR['1'] }}>Beta Total Mined</span>
<strong>{depot?.beta_total_mined ?? '-'}</strong>
</div>
</section>
{/* Real-time battle report */}
<section className="battle-log">
<h3>Real-time Battle Report</h3>
{miningEvents.length === 0 ? (
<p className="quiet">Mining area is quiet...</p>
) : (
<ul>
{miningEvents.map((e, i) => (
<li
key={i}
style={{ borderLeftColor: FACTION_COLOR[e.faction as '0' | '1'] }}
>
<span className="faction-tag" style={{ color: FACTION_COLOR[e.faction as '0' | '1'] }}>
[{FACTION_NAME[e.faction as '0' | '1']}]
</span>
{e.miner.slice(0, 8)}... collected {e.amount} units of resources
</li>
))}
</ul>
)}
</section>
</div>
)
}
Complete Deployment Process
# 1. Compile and publish contracts
cd war_game
sui move build
sui client publish --gas-budget 200000000
# 2. Initialize game objects
# Run scripts/init-game.ts: create MiningDepot, register stargate/turret extensions
# 3. Test player enlistment
# scripts/enlist-player.ts: issue FactionNFT to test players
# 4. Start dApp
cd dapp
npm run dev
Knowledge Integration
| Course Knowledge Point | Application in This Example |
|---|---|
| Chapter 3: Witness Pattern | MiningAuth, AlphaGateAuth, BetaGateAuth |
| Chapter 4: Component Extension Registration | Turret + Stargate + Storage box all have independent extensions |
| Chapter 5: dApp + Hooks | useRealtimeEvents drives real-time battle report updates |
| Chapter 11: OwnerCap | Alliance Leader holds OwnerCap of each component |
| Chapter 12: Event System | ResourceMined event drives dApp |
| Chapter 14: Token Economy | WAR Token as mining reward |
| Chapter 17: Security Audit | Permission verification + resource deduction without exceeding |
| Chapter 23: Publishing Process | Multiple contracts published simultaneously + initialization scripts |
| Chapter 8: Sponsored Transactions | Turret attack verification requires server signature |
| Chapter 9: GraphQL | Real-time query of mining area and battle status |
| Chapter 15: Cross-contract | mining_depot calls faction_nft read-only view |
| Chapter 13: NFT | FactionNFT Display shows faction information |
Advanced Challenges
- Alliance Expulsion: Leader can revoke FactionNFT of inactive members (transfer back to Admin or destroy)
- Resource Market: Deploy SSU near mining area, players can sell mined resources back to alliance for more WAR Token
- War Settlement: After 7 days, the alliance with the most total mining automatically receives the prize pool, contract auto-settles dividends
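The war-settlement challenge reduces to a winner comparison plus a pro-rata split. A sketch of the arithmetic in TypeScript before porting it into Move (settleWar and its inputs are hypothetical names for this exercise):

```typescript
// Pro-rata prize split among the winning faction's miners.
// `minedByPlayer` holds units mined per address within the winning faction.
function settleWar(
  alphaTotal: bigint,
  betaTotal: bigint,
  prizePool: bigint,
  minedByPlayer: Map<string, bigint>,
): Map<string, bigint> {
  const winnerTotal = alphaTotal >= betaTotal ? alphaTotal : betaTotal
  const payouts = new Map<string, bigint>()
  for (const [addr, mined] of minedByPlayer) {
    // Integer division: rounding dust stays in the pool, as it would on-chain.
    payouts.set(addr, (prizePool * mined) / winnerTotal)
  }
  return payouts
}
```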
Congratulations! You’ve Completed All Practical Cases
At this point, you have:
- Written 10 different types of contracts in Move from scratch
- Built 10 complete frontend dApps
- Mastered the complete tech stack from NFT, marketplace to DAO, competitions
- Understood on-chain and off-chain collaborative design patterns
You now possess all the technical capabilities to build complete commercial products in EVE Frontier.
Related Documentation for Full Course
Chapter 33: EVE Vault Wallet Overview — zkLogin Principles and Design
Learning Objective: Understand what EVE Vault is, why it uses zkLogin instead of traditional private keys, and the complete cryptographic working principles of zkLogin.
Status: Source code guide. Cryptographic details based on current EVE Vault implementation and Sui zkLogin mechanism; text emphasizes architectural understanding.
Minimal Call Chain
FusionAuth/OAuth login -> Callback get code -> Exchange token -> Derive zkLogin address -> Save login state -> Wallet can sign
Corresponding Code Directory
1. What is EVE Vault?
EVE Vault is EVE Frontier’s dedicated Chrome browser extension wallet, built on the following tech stack:
| Layer | Technology | Purpose |
|---|---|---|
| Extension Framework | WXT + Chrome MV3 | Cross-browser extension building |
| UI Framework | React + TanStack Router | Popup and approval pages |
| State Management | Zustand + Chrome Storage | Persist user state |
| Blockchain | Sui Wallet Standard | dApp discovery and interaction protocol |
| Identity Auth | EVE Frontier FusionAuth (OAuth) | EVE game account login |
| Address Derivation | Sui zkLogin + Enoki | Derive on-chain address from OAuth identity |
Core Design Philosophy: Players don’t need to manage private keys — log in with EVE Frontier game account, automatically get Sui blockchain address.
The most important thing in this chapter isn’t memorizing cryptographic terms, but seeing clearly whose problem it solves:
- For players: Lower wallet barrier
- For Builders: Lower onboarding cost
- For product: Consolidate “game identity” and “on-chain identity” into one experience chain as much as possible
2. Why Not Use Traditional Private Keys?
Pain points of ordinary Sui wallets:
❌ Players must safeguard a mnemonic phrase (12-24 words)
❌ A mnemonic leak means total asset loss
❌ The game account and on-chain identity are two independent systems
❌ A steep learning curve for new users
EVE Vault’s solution:
✅ The EVE Frontier game account (email login) corresponds directly to an on-chain address
✅ The address is deterministically derived via zkLogin (zero-knowledge proof)
✅ Even if the OAuth token is stolen, signing still requires a ZK proof
✅ Game account = on-chain identity, a seamless user experience
Real Product Significance
Not "more advanced," but letting the vast number of players who would never install a traditional wallet enter on-chain interactions.
For EVE Builders, this directly affects:
- How short the connection flow can be
- How low the first-use psychological cost is
- Whether sponsored transactions can remove the last layer of friction
3. zkLogin Principles Deep Dive
3.1 Core Concepts
zkLogin is a signature scheme natively supported by Sui that binds OAuth identity to blockchain address:
【Traditional Wallet】
Private key k → Public key PK → Address A
Signature = ed25519_sign(k, tx)
【zkLogin Wallet】
JWT (OAuth token) + Ephemeral Key → ZK Proof → Signature
↑
This proof proves "I hold valid JWT and JWT corresponds to address A"
3.2 zkLogin Address Formula
zkLogin_address = hash(
iss, // JWT issuer (like "https://auth.evefrontier.com")
sub, // User's unique ID (EVE account ID)
aud, // OAuth client ID
user_salt, // User salt saved by Enoki (prevent sub leaking link to on-chain identity)
)
Key Security: Even if an attacker knows your EVE account ID, they cannot compute your on-chain address without user_salt. The user_salt is kept by Enoki (Mysten Labs’ zkLogin service).
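To build intuition for why user_salt matters, here is a deliberately simplified derivation sketch that substitutes SHA-256 for the real Poseidon-based circuit hash; it demonstrates only determinism and salt-dependence, not the actual zkLogin address algorithm:

```typescript
import { createHash } from 'node:crypto'

// Toy address derivation: deterministic in (iss, sub, aud, userSalt).
// NOT the real zkLogin hash; illustration only.
function toyZkAddress(iss: string, sub: string, aud: string, userSalt: string): string {
  const h = createHash('sha256')
  for (const field of [iss, sub, aud, userSalt]) {
    h.update(field).update('\x00') // length-separate the fields
  }
  return '0x' + h.digest('hex')
}
```

Identical inputs always produce the same address, while changing only the salt produces an unlinkable one, which is exactly the property the formula above depends on.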
The Most Important zkLogin Intuition
Not "I have a long-term private key," but:
I use my OAuth identity to prove "who I am," then use the ephemeral key to prove "this operation is authorized by me."
This is why the zkLogin system simultaneously involves:
- JWT
- salt
- Ephemeral key
- ZK proof
They each prove different things.
3.3 Ephemeral Key Pair
zkLogin uses an ephemeral key pair to perform actual signing:
Login flow:
1. Generate ephemeral ed25519 key pair (validity = Sui Epoch, ~24h)
2. Embed ephemeral public key's nonce into OAuth request
3. OAuth server returns JWT containing nonce in token
4. Sign transaction with ephemeral private key
5. Submit ZK proof + ephemeral signature → Sui verifies
The ephemeral private key is stored in EVE Vault’s Keeper security container (see Chapter 34).
Why Must There Be an Ephemeral Key?
Because zkLogin doesn’t turn the OAuth token itself into a signer. The ephemeral key’s role is to connect the "login state" to each "specific signing action," while limiting the risk window to a relatively short period.
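The ephemeral key itself is an ordinary Ed25519 key pair; only its lifetime and custody are special. A sketch using Node's built-in crypto module (in the real wallet, generation and signing happen inside the Keeper and the key is bound to a max Epoch):

```typescript
import { generateKeyPairSync, sign, verify } from 'node:crypto'

// Generate an ephemeral Ed25519 key pair (in EVE Vault this lives only
// in Keeper memory and expires with the Sui Epoch).
const { publicKey, privateKey } = generateKeyPairSync('ed25519')

// Sign arbitrary transaction bytes with the ephemeral private key.
// For Ed25519, Node requires the algorithm argument to be null.
const txBytes = Buffer.from('example transaction bytes')
const signature = sign(null, txBytes, privateKey)

// Anyone holding the ephemeral public key can verify the signature;
// zkLogin additionally proves the key was authorized via the JWT nonce.
const ok = verify(null, txBytes, publicKey, signature)
```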
3.4 ZK Proof Generation
// packages/shared/src/wallet/zkProof.ts
interface ZkProofParams {
jwtRandomness: string; // Random salt, prevent nonce derivation
maxEpoch: string; // Ephemeral key's max valid Epoch
ephemeralPublicKey: PublicKey; // Ephemeral public key (embedded in JWT nonce)
idToken: string; // JWT obtained from FusionAuth
enokiApiKey: string; // Enoki service key
network?: string; // devnet | testnet | mainnet
}
ZK Proof generation steps:
- Collect the parameters above
- Call the Sui ZK Prover endpoint (hosted by Enoki)
- Receive a ZK proof containing proofPoints, issBase64Details, and headerBase64
3.5 JWT Nonce Construction
zkLogin’s most critical design is embedding the ephemeral public key into the JWT, achieved through the nonce field:
// nonce = poseidon_hash(ephemeral_public_key, max_epoch, randomness)
// This step completed before requesting OAuth
const nonce = generateNonce(ephemeralPublicKey, maxEpoch, randomness);
// Pass nonce in OAuth URL
const authUrl = `${fusionAuthUrl}/oauth2/authorize?`
+ `client_id=${CLIENT_ID}`
+ `&response_type=code`
+ `&nonce=${nonce}` // ← FusionAuth will put this nonce into JWT
+ `&scope=openid+profile+email`;
FusionAuth includes in returned JWT (id_token):
{
"iss": "https://auth.evefrontier.com",
"sub": "user-12345", ← EVE account unique ID
"aud": "your-client-id",
"nonce": "H5SmVjkG...", ← Contains ephemeral public key info
"exp": 1712345678
}
By checking the ephemeral public key embedded in the nonce, Sui’s ZK verifier confirms that the signature truly comes from that ephemeral key pair.
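Reading the nonce (or any other claim) out of the JWT requires nothing more than base64url-decoding its payload segment. A minimal decoder sketch; production code must also verify the JWT signature against FusionAuth's published keys, which this deliberately skips:

```typescript
// Decode the payload (second segment) of a JWT without verifying it.
// Sufficient for reading iss/sub/aud/nonce locally; signature
// verification is a separate, mandatory step in real code.
function decodeJwtPayload(jwt: string): Record<string, unknown> {
  const segments = jwt.split('.')
  if (segments.length !== 3) throw new Error('malformed JWT')
  const json = Buffer.from(segments[1], 'base64url').toString('utf8')
  return JSON.parse(json)
}
```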
3.6 zkLogin Address TypeScript Calculation
import { computeZkLoginAddress } from "@mysten/sui/zklogin";
// Get user_salt and address from Enoki API
const { address, salt } = await fetch("https://api.enoki.mystenlabs.com/v1/zklogin", {
method: "POST",
headers: { "Authorization": `Bearer ${ENOKI_API_KEY}` },
body: JSON.stringify({ jwt: idToken }),
}).then(r => r.json());
// Verify: compute address locally (same as Enoki-returned address)
const localAddress = computeZkLoginAddress({
claimName: "sub",
claimValue: decodedJwt.sub, // EVE account ID
iss: decodedJwt.iss,
aud: decodedJwt.aud,
userSalt: BigInt(salt),
});
console.assert(address === localAddress);
4. EVE Vault Authentication Flow
User clicks "Sign in with EVE Vault"
│
▼
Generate ephemeral Ed25519 key pair + JWT Nonce
│
▼
Open FusionAuth OAuth page (chrome.identity API)
│
▼
User logs in with EVE Frontier account
│
▼
FusionAuth returns JWT (containing nonce)
│
▼
Call Enoki API → Get user_salt + zkLogin address
│
▼
Call ZK Prover → Generate ZK Proof
│
▼
Popup displays zkLogin address + SUI balance
│
▼
dApp calls wallet.connect() → Get address → Can send transactions
What Matters Most in This Flow Isn’t the Number of Steps, But the Clear Responsibilities
- FusionAuth is responsible for confirming who the user is
- Enoki is responsible for providing the zkLogin address and salt
- The Prover is responsible for generating the verifiable proof
- The Vault is responsible for organizing all of this into wallet capability
Once you separate each layer’s responsibilities, the mechanism no longer seems mysterious.
5. Multi-Network Support
EVE Vault supports connecting to multiple test networks and can switch between them at any time:
// packages/shared/src/types/wallet.ts
export class EveVaultWallet implements Wallet {
#currentChain: SuiChain = SUI_TESTNET_CHAIN;
get chains(): Wallet["chains"] {
return [SUI_TESTNET_CHAIN, SUI_DEVNET_CHAIN] as `sui:${string}`[];
}
// ...
}
The network switcher in the popup’s bottom-left corner lets players switch between Devnet (development testing) and Testnet (demo/pre-launch). The address stays the same after switching (the derivation formula contains no network parameter), but the nodes queried will change.
6. EVE Vault vs Traditional Sui Wallet Comparison
| Feature | Sui Wallet / OKX | EVE Vault |
|---|---|---|
| Need mnemonic | ✅ Yes | ❌ No |
| Based on OAuth login | ❌ No | ✅ Yes (EVE account) |
| Private key storage location | User local | No private key (zkLogin) |
| Address determinism | Depends on private key | JWT + salt deterministic derivation |
| Signature scheme | ed25519 / secp256k1 | zkLogin (ZK Proof + ephemeral signature) |
| Sponsored transactions | Partial support | ✅ EVE Frontier native support |
| dApp discovery | Wallet Standard | Wallet Standard + EVE extension features |
7. Security Model
Keeper Mechanism
The ephemeral private key is not stored in chrome.storage (which is readable by JS), but in the Keeper (an isolated hidden document):
┌─────────────────────────────────────────┐
│ Chrome Extension Sandbox │
│ │
│ Background Service Worker │
│ ↕ chrome.runtime.sendMessage │
│ Keeper (hidden iframe/document) │
│ ← Ephemeral private key only in this memory │
│ ← Not written to chrome.storage │
│ ← Cleared when browser closes │
└─────────────────────────────────────────┘
Lock Mechanism
After the browser closes or a period of inactivity, the Keeper automatically clears the ephemeral private key (the "locked" state). Unlocking requires regenerating the ZK Proof (cached, so it usually completes in seconds).
8. Significance for Builders
As a Builder, your dApp users will connect through EVE Vault. Key impacts:
- No private-key management UX: Users connect directly with their game account, lowering the onboarding barrier
- Native sponsored-transaction support: EVE Vault implements sign_sponsored_transaction, so Builders can pay gas for users
- Address stability: A player’s on-chain address is bound to their EVE account and won’t change when they switch devices
- Multi-network: Develop on Devnet, launch on Testnet; the address stays the same
Chapter Summary
| Concept | Key Points |
|---|---|
| zkLogin | Passwordless zero-knowledge signature scheme based on OAuth JWT |
| user_salt | Kept by Enoki; prevents linking the OAuth ID to the on-chain address |
| Ephemeral key pair | Regenerated each Epoch, Keeper security container storage |
| ZK Proof | Requested from Enoki, proves “legitimate JWT holder” |
| FusionAuth | EVE Frontier’s OAuth identity provider |
Next Chapter: EVE Vault Technical Architecture and Development Deployment — Chrome MV3’s 5 script layers, message communication protocol, and how to locally build and load extension.
Chapter 34: EVE Vault Technical Architecture and Development Deployment
Learning Objective: Understand EVE Vault’s Chrome MV3 architecture (5 script layers, message protocol, Keeper security container), master complete flow for locally building and debugging extension, and division of responsibilities among packages in Monorepo.
Status: Source code guide. Recommend reading while opening extension entry points and background code to verify message flows.
Minimal Call Chain
Page/content script request -> background dispatches message -> keeper protects sensitive state -> approval page signs -> response returns to caller
Corresponding Code Directory
1. Project Structure (Monorepo)
evevault/
├── apps/
│ ├── extension/ # Chrome MV3 extension (main body)
│ │ ├── entrypoints/ # WXT entry points (each = independent page/script)
│ │ │ ├── background.ts # Service Worker (background resident)
│ │ │ ├── content.ts # Content script (injected into each page)
│ │ │ ├── injected.ts # Page context script (register wallet)
│ │ │ ├── popup/ # Extension popup
│ │ │ ├── sign_transaction/ # Transaction approval page
│ │ │ ├── sign_sponsored_transaction/ # Sponsored transaction approval page
│ │ │ ├── sign_personal_message/ # Message signing approval page
│ │ │ ├── sign_and_execute_transaction/
│ │ │ └── keeper/ # Security key container
│ │ └── src/
│ │ ├── features/ # Feature modules (auth, wallet)
│ │ ├── lib/ # Core library (adapters, background, utils)
│ │ └── routes/ # React routes (TanStack Router)
│ └── web/ # Web version (coming soon)
└── packages/
└── shared/ # Cross-app shared: types, Sui client, utility functions
└── src/
├── types/ # Message types, wallet types, auth types
├── sui/ # SuiClient, GraphQL client
└── auth/ # Enoki integration, zkLogin tools
Build Tools: Bun (package management) + Turborepo (build cache) + WXT (extension framework)
What’s really worth understanding about the Monorepo here:
Vault isn’t a single-page extension, but a group of isolated subsystems collaborating through a message protocol.
So when reading the directory tree, don’t just note where files are; note which layer holds which powers.
2. Chrome MV3’s 5-Layer Script Architecture
These 5 layers resolve the core security tension in browser extensions:
- The dApp needs an easy-to-access wallet interface
- But sensitive state must not be exposed to arbitrary page scripts
So the architecture is deliberately split:
- The page layer is discoverable
- The relay layer can communicate
- The background layer can dispatch
- The Keeper layer can keep secrets
- The approval pages let the user make the final confirmation
Chrome MV3 extension isolation boundaries and communication methods between scripts:
┌──────────────────── Browser Tab (Web Page)───────────────────────┐
│ │
│ dApp (Web Page JavaScript) │
│ ↕ wallet-standard API (same process call) │
│ injected.ts ← Injected into page process by content.ts │
│ EveVaultWallet class registered to @mysten/wallet-standard │
└───────────────────────────────────────────────────────────────┘
↕ window.postMessage (cross-process)
┌──────────────────── Chrome Extension Process ────────────────────┐
│ content.ts (content script) │
│ Forward: page → background │
│ Forward: background → page │
└───────────────────────────────────────────────────────────────┘
↕ chrome.runtime.sendMessage
┌──────────────────── Service Worker ────────────────────────────┐
│ background.ts │
│ OAuth flow, Token exchange, Storage management │
│ Handle signing requests (forward to Keeper) │
│ ↕ chrome.runtime Port │
│ keeper.ts (hidden iframe, memory security container) │
│ Store ephemeral private key (not written to chrome.storage) │
└─────────────────────────────────────────────────────────────────┘
↕ chrome.runtime.sendMessage
┌──────────────────── Extension Pages ───────────────────────────┐
│ popup/ ← Displayed when clicking extension icon │
│ sign_transaction/ ← Transaction approval popup │
│ sign_sponsored_transaction/ ← Sponsored transaction approval │
│ sign_personal_message/ ← Message signing approval │
└─────────────────────────────────────────────────────────────────┘
3. Message System (Message Protocol)
Why is the message protocol this extension system’s lifeline?
Because the extension doesn’t rely on direct function calls; it is driven by cross-process messages.
Once message types, field semantics, or response contracts become messy, the hardest-to-debug class of problem appears:
- The page appears to send the request
- The background receives it
- But the semantics returned by the keeper or the approval page are already inconsistent
So in this kind of system, the message protocol itself is the "interface standard."
All cross-process communication through standardized message type definitions:
// packages/shared/src/types/messages.ts
// Auth-related messages
export enum AuthMessageTypes {
AUTH_SUCCESS = "auth_success",
AUTH_ERROR = "auth_error",
EXT_LOGIN = "ext_login",
REFRESH_TOKEN = "refresh_token",
}
// Vault (encryption container) messages
export enum VaultMessageTypes {
UNLOCK_VAULT = "UNLOCK_VAULT",
LOCK = "LOCK",
CREATE_KEYPAIR = "CREATE_KEYPAIR",
GET_PUBLIC_KEY = "GET_PUBLIC_KEY",
ZK_EPH_SIGN_BYTES = "ZK_EPH_SIGN_BYTES", // Sign with ephemeral private key
SET_ZKPROOF = "SET_ZKPROOF",
GET_ZKPROOF = "GET_ZKPROOF",
CLEAR_ZKPROOF = "CLEAR_ZKPROOF",
}
// Wallet Standard related (dApp triggered)
export enum WalletStandardMessageTypes {
SIGN_PERSONAL_MESSAGE = "sign_personal_message",
SIGN_TRANSACTION = "sign_transaction",
SIGN_AND_EXECUTE_TRANSACTION = "sign_and_execute_transaction",
EVEFRONTIER_SIGN_SPONSORED_TRANSACTION = "sign_sponsored_transaction",
}
// Keeper security container messages
export enum KeeperMessageTypes {
READY = "KEEPER_READY",
CREATE_KEYPAIR = "KEEPER_CREATE_KEYPAIR",
UNLOCK_VAULT = "KEEPER_UNLOCK_VAULT",
GET_PUBLIC_KEY = "KEEPER_GET_KEY",
EPH_SIGN = "KEEPER_EPH_SIGN", // Ephemeral private key signing
CLEAR_EPHKEY = "KEEPER_CLEAR_EPHKEY",
SET_ZKPROOF = "KEEPER_SET_ZKPROOF",
GET_ZKPROOF = "KEEPER_GET_ZKPROOF",
CLEAR_ZKPROOF = "KEEPER_CLEAR_ZKPROOF",
}
Message Flow: dApp Signing Request Complete Path
dApp calls wallet.signTransaction(tx)
↓ wallet-standard (same process)
injected.ts (EveVaultWallet.signTransaction)
↓ window.postMessage({ type: "sign_transaction", ... })
content.ts
↓ chrome.runtime.sendMessage(...)
background.ts (walletHandlers.ts)
→ Open sign_transaction approval window
← User clicks "Approve"
→ Send message to Keeper
↓ chrome.runtime Port
keeper.ts
→ Sign with ephemeral private key
→ Return ZK Proof + signature
↓ chrome.runtime Port
background.ts
↓ chrome.runtime.sendMessage
content.ts
↓ window.postMessage
injected.ts
→ Return SignedTransaction to dApp
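Every hop on this path is asynchronous, so the injected script must correlate each response with the request that caused it. The standard trick is a pending-promise map keyed by a request id; a self-contained sketch of the idea (MessageBridge is a hypothetical name, independent of the real Vault code):

```typescript
// Correlate async request/response pairs across a message boundary.
type Pending = { resolve: (v: unknown) => void; reject: (e: Error) => void }

class MessageBridge {
  private nextId = 0
  private pending = new Map<number, Pending>()

  constructor(private post: (msg: { id: number; type: string; payload?: unknown }) => void) {}

  /** Send a request; the promise resolves when the matching response arrives. */
  request(type: string, payload?: unknown): Promise<unknown> {
    const id = this.nextId++
    return new Promise((resolve, reject) => {
      this.pending.set(id, { resolve, reject }) // register before posting
      this.post({ id, type, payload })
    })
  }

  /** Called when a response message arrives from the other side. */
  onResponse(msg: { id: number; result?: unknown; error?: string }): void {
    const entry = this.pending.get(msg.id)
    if (!entry) return // response for an unknown or expired request
    this.pending.delete(msg.id)
    if (msg.error) entry.reject(new Error(msg.error))
    else entry.resolve(msg.result)
  }
}
```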
4. Wallet Standard Implementation (SuiWallet.ts)
EVE Vault implements @mysten/wallet-standard’s Wallet interface, letting all dApps supporting Wallet Standard automatically discover it:
// apps/extension/src/lib/adapters/SuiWallet.ts
export class EveVaultWallet implements Wallet {
readonly #version = "1.0.0" as const;
readonly #name = "Eve Vault" as const;
// Supported Sui network chains
get chains(): Wallet["chains"] {
return [SUI_TESTNET_CHAIN, SUI_DEVNET_CHAIN] as `sui:${string}`[];
}
// Implemented Wallet Standard features
get features() {
return {
[StandardConnect]: { connect: this.#connect },
[StandardDisconnect]: { disconnect: this.#disconnect },
[StandardEvents]: { on: this.#on },
[SuiSignTransaction]: { signTransaction: this.#signTransaction },
[SuiSignAndExecuteTransaction]: { signAndExecuteTransaction: this.#signAndExecuteTransaction },
[SuiSignPersonalMessage]: { signPersonalMessage: this.#signPersonalMessage },
// EVE Frontier proprietary extension feature
[EVEFRONTIER_SPONSORED_TRANSACTION]: {
signSponsoredTransaction: this.#signSponsoredTransaction,
},
};
}
}
Register to Page (injected.ts)
// apps/extension/entrypoints/injected.ts
import { registerWallet } from "@mysten/wallet-standard";
import { EveVaultWallet } from "../src/lib/adapters/SuiWallet";
// Register immediately on page load
registerWallet(new EveVaultWallet());
dApps automatically discover EveVaultWallet through @mysten/wallet-standard’s getWallets(), no special integration needed.
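The discovery mechanism itself is conceptually tiny: a shared registry plus a notification hook. A toy re-implementation of the pattern (the real protocol is @mysten/wallet-standard; these names are illustrative only):

```typescript
// Minimal wallet-standard-style registry: wallets self-register,
// dApps enumerate existing wallets and subscribe to later registrations.
interface MiniWallet { name: string }

const wallets: MiniWallet[] = []
const listeners: Array<(w: MiniWallet) => void> = []

function registerWallet(w: MiniWallet): void {
  wallets.push(w)
  listeners.forEach((fn) => fn(w)) // notify dApps that subscribed early
}

function getWallets(): { get(): MiniWallet[]; on(fn: (w: MiniWallet) => void): void } {
  return { get: () => [...wallets], on: (fn) => { listeners.push(fn) } }
}
```

This is why neither side needs to know about the other in advance: the registry is the only shared contract.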
5. Keeper: Security Key Container
Keeper is EVE Vault’s most unique security design — ephemeral private key never leaves Keeper process memory:
// apps/extension/entrypoints/keeper/keeper.ts
// Message types Keeper handles
switch (message.type) {
case KeeperMessageTypes.CREATE_KEYPAIR:
// Generate new Ed25519 ephemeral key pair
// Private key only in memory, not written to chrome.storage
break;
case KeeperMessageTypes.EPH_SIGN:
// Sign bytes with ephemeral private key
// Only expose signature result, not private key
break;
case KeeperMessageTypes.CLEAR_EPHKEY:
// Clear ephemeral private key in memory (lock operation)
break;
}
Security Guarantees:
- The ephemeral private key is a memory variable; it is never serialized to chrome.storage
- If the browser closes or Keeper crashes, the key is destroyed automatically
- Re-unlocking regenerates a fresh ephemeral key pair
- Background and Popup cannot read the private key directly; they can only request signatures via Port messages

Keeper's real strength is not mystery but permission minimization. It compresses the most sensitive capability down to three operations:
- generate the ephemeral key
- sign with the ephemeral key
- clear the ephemeral key

Every other layer avoids touching the key material itself.
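The three Keeper operations above can be sketched as a closure that hides the key from every caller. This is a minimal illustration of the pattern, not EVE Vault's actual code: the names `KeeperMessage` and `createKeeper` are invented for the example, and Node's `crypto` module stands in for the extension's signing backend.

```typescript
// Minimal sketch of the Keeper pattern: the private key lives only in a
// closure variable, and the outside world can send exactly three messages.
import { generateKeyPairSync, sign, KeyObject } from "node:crypto";

type KeeperMessage =
  | { type: "CREATE_KEYPAIR" }
  | { type: "EPH_SIGN"; bytes: Uint8Array }
  | { type: "CLEAR_EPHKEY" };

export function createKeeper() {
  // Ephemeral private key: memory only, never serialized anywhere.
  let ephPrivate: KeyObject | null = null;

  return function handle(msg: KeeperMessage): Uint8Array | string | null {
    switch (msg.type) {
      case "CREATE_KEYPAIR": {
        const { privateKey, publicKey } = generateKeyPairSync("ed25519");
        ephPrivate = privateKey;
        // Only the public key ever leaves the Keeper.
        return publicKey.export({ type: "spki", format: "der" });
      }
      case "EPH_SIGN": {
        if (!ephPrivate) return "locked"; // must unlock (create a key) first
        // Ed25519 signs the raw bytes; the key object itself is never exposed.
        return sign(null, Buffer.from(msg.bytes), ephPrivate);
      }
      case "CLEAR_EPHKEY": {
        ephPrivate = null; // lock: the key becomes garbage-collectable
        return null;
      }
    }
  };
}
```

The design point is that callers hold a function reference, not the key: even a compromised caller can only ask for signatures, mirroring how Background/Popup are limited to Port-message signing requests.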
6. Local Development Configuration
Install Dependencies

```shell
# Bun is recommended
bun install
```

Configure .env

```shell
# apps/extension/.env
VITE_FUSION_SERVER_URL="https://auth.evefrontier.com"
VITE_FUSIONAUTH_CLIENT_ID=your-fusionauth-client-id
VITE_FUSION_CLIENT_SECRET=your-fusionauth-client-secret
VITE_ENOKI_API_KEY=your-enoki-api-key
EXTENSION_ID="your-extension-public-key"
```

Start Development Mode

```shell
# Run only the extension (recommended)
bun run dev:extension

# Run all apps (extension + web)
bun run dev
```
In development mode, WXT emits the extension files into `apps/extension/.output/chrome-mv3/` and watches for file changes to rebuild automatically.
Load the Extension in Chrome
- Open `chrome://extensions`
- Enable "Developer mode" in the top-right corner
- Click "Load unpacked"
- Select `apps/extension/.output/chrome-mv3/`

After each file change, Chrome detects the rebuild and prompts you to update the extension (no manual reload needed).
7. Build Production Version

```shell
# Build the Chrome extension
bun run build:extension
# Output: apps/extension/.output/chrome-mv3.zip

# Build all apps
bun run build

# Clear all caches (use when builds become slow)
bun run clean
```
8. FusionAuth OAuth Configuration
In the FusionAuth console, add the following redirect URI (fixed format):

```
https://<extension-id>.chromiumapp.org/
```

The extension ID is the unique identifier Chrome assigns to the extension (visible on the `chrome://extensions` page).
Required OAuth scopes:
- `openid` (JWT-format token)
- `profile` (user info)
- `email` (user email)
9. Turborepo Build Cache
The project uses Turborepo to speed up builds:

```shell
# turbo.json defines task dependencies and parallelism
# build:extension depends on the shared package build
bun run build:extension
# → builds packages/shared first
# → then builds apps/extension (using the cache)

# Force a rebuild (ignore cache)
bun run build --force
```
10. E2E Testing

```shell
# tests/e2e/ contains end-to-end tests such as balance queries
bun run test:e2e
# Tests require a logged-in wallet and a configured test account
# tests/e2e/helpers/state.ts provides state-management helpers
```
Chapter Summary
| Component | Layer | Function |
|---|---|---|
| injected.ts | Page process | Registers EveVaultWallet with the Wallet Standard |
| content.ts | Content script | Message bridge: page ↔ Background |
| background.ts | Service Worker | OAuth, storage, request coordination |
| keeper.ts | Hidden container | Secure storage and use of the ephemeral private key |
| popup/ | Extension page | User interface: login, address, balance |
| sign_*/ | Extension pages | Transaction/message approval UI |
| SuiWallet.ts | Adapter | Complete Wallet Standard implementation |
Next Chapter: Future Outlook — ZK proofs, full decentralization, and EVM interoperability possibilities for EVE Frontier and the Sui ecosystem.
Chapter 35: Future Outlook — ZK Proofs, Full Decentralization and EVM Interoperability
Objective: Understand the cutting-edge technical directions for EVE Frontier and the Sui ecosystem, think ahead about how to prepare your architecture for key future upgrades, and become a builder at the technology frontier.
Status: Outlook chapter. The text focuses on future technical directions and architectural preparation.
35.1 Current Trust Assumptions and Limitations
Reviewing the core "trust assumptions" in the architecture built throughout this course:
| Component | Current Dependency | Limitations |
|---|---|---|
| Proximity verification | Game server signature | Server can lie or go down |
| Location privacy | Server doesn’t leak hash mapping | Server knows all locations |
| Component state updates | Game server submission | Centralized bottleneck |
| Game rule modifications | CCP-controlled contract upgrades | Players have no direct governance rights |
These limitations are not design failures but tradeoffs imposed by current technology and engineering. EVE Frontier's official roadmap promises to gradually eliminate these centralized dependencies.
The easiest way to write this chapter would be as a "technology vision list," but the truly valuable perspective is:
Which future directions are worth leaving interfaces for today, and which merely need awareness, with no premature commitment.
Handled poorly, "future-readiness" leads builders into two common failure modes:
- Over-preparation: the system becomes bloated today
- No preparation at all: future changes force a full refactor
35.2 Application Prospects for Zero-Knowledge Proofs (ZK Proofs)
What Are ZK Proofs?
Zero-knowledge proofs let one party (the Prover) convince another (the Verifier) that a statement is true without revealing anything beyond the statement itself:

```
Current (server signature):
Player → "I'm near the stargate" → server queries coordinates → signs a proof → chain verifies the signature

Future (ZK proof):
Player computes locally: "Generate a ZK proof that I know coordinates (x, y)
such that hash(x, y, salt) = the hash stored on-chain,
and distance((x, y), stargate) < 20 km"
→ submit the ZK proof on-chain
→ a Sui verifier smart contract checks the proof (no server needed)
```
What ZK Means for EVE Frontier

```
Now                                     Future (ZK)
────────────────────────────────────────────────────────────────
Proximity → server signature            Proximity → player-generated ZK proof
Location privacy → trust the server     Location privacy → mathematical guarantee
Jump verification → server must be up   Jump verification → fully on-chain
Off-chain arbitration → CCP decisions   Off-chain arbitration → community DAO
```
Contract Design Prepared for ZK

```move
// Now: use AdminACL to verify the server signature
public fun jump(
    gate: &Gate,
    admin_acl: &AdminACL,        // now: verify the server sponsor
    ctx: &TxContext,
) {
    verify_sponsor(admin_acl, ctx); // check the server is in the authorized list
}

// Future (ZK era): swap the verification logic; business code unchanged
public fun jump(
    gate: &Gate,
    proximity_proof: vector<u8>, // switch to a ZK proof
    proof_inputs: vector<u8>,    // public inputs (location hash, distance threshold)
    verifier: &ZkVerifier,       // Sui's ZK verification contract
    ctx: &TxContext,
) {
    // verify the ZK proof on-chain instead
    zk_verifier::verify_proof(verifier, proximity_proof, proof_inputs);
}
```

Key architecture recommendation: encapsulate location verification as an independent function now, so that in the future you only replace the verification logic and never rewrite the business code.
What Is ZK's Most Realistic Value for Today's Builders?
Not writing proof systems yourself right away, but learning to separate the "proof mechanism" from the "business state machine."
That way, if in the future:
- server signatures are replaced by ZK proofs
- some verification steps become locally generated proofs
- different components use different proof backends
you are replacing the verification layer, not the entire product logic.
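The separation described above can be sketched as a simple strategy pattern. This is an illustrative sketch, not EVE Frontier's API: the `ProximityVerifier` interface and both implementations are invented names, and the "verification" bodies are stubs standing in for real signature and proof checks.

```typescript
// Sketch: the business state machine depends only on an interface,
// so swapping server signatures for ZK proofs replaces one object, not the flow.
interface ProximityVerifier {
  verify(playerId: string, gateId: string, evidence: Uint8Array): boolean;
}

// Today: trust a server-signed attestation (stubbed: evidence = signer id).
class ServerSignatureVerifier implements ProximityVerifier {
  constructor(private trustedSigners: Set<string>) {}
  verify(_playerId: string, _gateId: string, evidence: Uint8Array): boolean {
    const signer = new TextDecoder().decode(evidence);
    return this.trustedSigners.has(signer);
  }
}

// Tomorrow: verify a ZK proof the player generated locally (stubbed).
class ZkProofVerifier implements ProximityVerifier {
  verify(_playerId: string, _gateId: string, evidence: Uint8Array): boolean {
    return evidence.length > 0; // placeholder for real proof verification
  }
}

// Business state machine: identical regardless of the verifier behind it.
function jump(
  verifier: ProximityVerifier,
  playerId: string,
  gateId: string,
  evidence: Uint8Array,
): string {
  if (!verifier.verify(playerId, gateId, evidence)) {
    throw new Error("proximity check failed");
  }
  return `${playerId} jumped through ${gateId}`;
}
```

The same shape applies on-chain: the two Move `jump` signatures above differ only in which verification call sits at the top of the function body.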
35.3 Fully On-Chain Games
Blockchain gaming's ultimate form: game logic lives entirely on-chain, with no centralized servers.

```
An idealized fully on-chain game:
All game state      → on-chain objects
All rule execution  → Move contracts
All randomness      → Sui's native on-chain randomness
All verification    → ZK proofs
All governance      → DAO voting
```
On-Chain Verifiable Randomness (`sui::random`)

```move
use sui::random::{Self, Random};

public fun open_loot_box(
    loot_box: &mut LootBox,
    random: &Random,         // randomness object provided by the Sui system
    ctx: &mut TxContext,
): Item {
    let mut rng = random::new_generator(random, ctx);
    let roll = rng.generate_u64() % 100;  // uniform in 0-99
    let item_tier = if (roll < 60) { 1 }  // 60% common
        else if (roll < 90) { 2 }         // 30% rare
        else { 3 };                       // 10% epic
    mint_item(item_tier, ctx)
}
```
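The tier mapping in the loot-box example is worth isolating, since the same cutoff logic appears whenever a uniform roll is bucketed into rarities. A minimal TypeScript rendering of the same 60/30/10 split, assuming `roll` is uniform in 0 to 99 (e.g. `generate_u64() % 100` on-chain):

```typescript
// Map a uniform roll in [0, 99] to an item tier with the 60/30/10 split
// used in the Move example above.
function itemTier(roll: number): 1 | 2 | 3 {
  if (roll < 60) return 1; // rolls 0-59  → common (60%)
  if (roll < 90) return 2; // rolls 60-89 → rare   (30%)
  return 3;                // rolls 90-99 → epic   (10%)
}
```

Because the cutoffs are cumulative (60, then 60+30), the probabilities of each branch are exactly the interval widths, which makes drop tables easy to audit.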
On-Chain AI NPCs (Experimental)
Combined with ZK machine learning (ZKML), NPC decision logic could in theory also move on-chain:

```
On-chain NPC contract → receives game-state inputs
→ verifies the "correctness of the AI decision" on-chain via ZKML
→ outputs the resulting action
```

This Is Where Realistic Judgment Matters Most
"Fully on-chain" does not automatically mean "better suited to current EVE builder tasks." Many genuinely valuable products today are still hybrid architectures:
- key assets and rules on-chain
- high-speed world simulation off-chain
- the verification boundary gradually pushed forward
So the practical goal is usually not to reach fully on-chain in one leap, but to keep shrinking the area that must rely on centralized trust.
35.4 Interoperability Between Sui and Other Ecosystems
Sui Bridge: Cross-Chain Assets

```typescript
// Future (illustrative): move EVE game items from Ethereum via Sui Bridge
const suiBridge = new SuiBridge({ network: "testnet" });

// Bridge an NFT from Ethereum to Sui
await suiBridge.deposit({
  sender: ethAddress,
  recipient: suiAddress,
  token: ethNftContractAddress,
  tokenId: "12345",
});
```

State Proofs
Sui can prove its own on-chain state to other chains, making cross-chain asset proofs possible:

```
An EVE Frontier player owns rare ore (on Sui)
→ generate a Sui state proof
→ use the Sui asset as collateral on an Ethereum DEX
```
The Most Interesting Question for Interoperability Is Not "Can It Bridge?" but "Do the Semantics Still Match After Bridging?"
For example:
- Does an EVE asset on another chain still carry the same permissions and items?
- Does the other chain's financial context understand its real risks?
- When the bridge fails, freezes, or rolls back, how do users reason about asset state?
Cross-chain is therefore not a purely technical extension; it is also a layer of product-semantics migration.
35.5 DAO Governance: Builders Participate in Rule-Making
As the game matures, more game parameters may open up to DAO voting:

```move
// Future: fee parameters decided by DAO vote
public fun update_energy_cost_via_dao(
    new_cost: u64,
    dao_proposal: &ExecutedProposal, // credential for a passed DAO proposal
    energy_config: &mut EnergyConfig,
) {
    // verify the proposal passed and has not expired
    dao::verify_executed_proposal(dao_proposal);
    energy_config.update_cost(new_cost);
}
```

Not Every Parameter Is Worth Putting Under DAO Control
Usually better suited to a DAO:
- medium- to long-term rule parameters
- allocation of high-value public resources
- profit distribution and governance items spanning multiple stakeholders
Usually poorly suited to full DAO control:
- high-frequency operational parameters
- security switches that need second-level response times
- routine actions that clearly belong to the execution layer
Otherwise governance degrades from "collective decision-making" into "system blockage."
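The split between DAO-governed and operational parameters can be made concrete in one update path. This is an illustrative sketch only: `ExecutedProposal`, `DAO_GOVERNED`, and `applyParameter` are invented names, and the proposal check stands in for the on-chain `dao::verify_executed_proposal` call.

```typescript
// Sketch: one update path where governed parameters require an executed,
// unexpired proposal, while operational knobs fall through without governance.
interface ExecutedProposal {
  parameter: string;
  newValue: number;
  executedAt: number; // epoch ms
  expiresAt: number;  // proposals go stale if not applied in time
}

// Medium/long-term economic parameters: DAO-governed.
const DAO_GOVERNED = new Set(["energy_cost", "toll_rate"]);

function applyParameter(
  config: Map<string, number>,
  parameter: string,
  newValue: number,
  now: number,
  proposal?: ExecutedProposal,
): void {
  if (DAO_GOVERNED.has(parameter)) {
    if (!proposal) throw new Error(`${parameter} requires a DAO proposal`);
    if (proposal.parameter !== parameter || proposal.newValue !== newValue)
      throw new Error("proposal does not match this change");
    if (now > proposal.expiresAt) throw new Error("proposal expired");
  }
  // Security switches and operational knobs skip governance entirely.
  config.set(parameter, newValue);
}
```

Keeping both paths in one function makes the governance boundary explicit and auditable: moving a parameter into or out of DAO control is a one-line change to the governed set.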
35.6 Long-Term Advice for Builders
Technology Choices
✅ Do now:
- Encapsulate verification logic as replaceable modules
- Use dynamic fields to reserve extension space
- Leave interfaces for DAO-governed parameters
- Keep contracts modular for easy upgrades
🔮 Technology directions to watch:
- Native ZK proof support on Sui
- Extensions to Sui Move's type system
- Maturing cross-chain bridge security
- Practical applications of ZKML in gaming
Business Positioning
Short-term (doable now):
- Economic systems: stargate tolls, markets, auctions
- Alliance collaboration tools (dividends, governance)
- Game data dashboards and analytics services
Medium-term (1-2 years):
- Multi-tenant SaaS platforms (universal marketplaces, quest frameworks)
- Cross-alliance protocols and standards
- Data analytics and business intelligence
Long-term (after ZK matures):
- Fully decentralized game instances (games within the game)
- ZK-driven private trading
- Cross-chain financialization of EVE assets
The Real Long-Term Advice Compresses into One Sentence
First make the systems you can actually deliver today modular, upgradable, and cleanly bounded; then welcome future capabilities.
The future rewards not whoever shouted the slogans earliest, but whoever built the system that is easiest to evolve into tomorrow's.
35.7 The End of This Course Is the Next Starting Point
Congratulations on completing the EVE Frontier Builder course! You now have:
- ✅ Move contract development: from basics to advanced patterns
- ✅ Smart assembly modification: complete APIs for turrets, stargates, and storage units
- ✅ Economic system design: tokens, markets, DAO governance
- ✅ Full-stack dApp development: React + Sui SDK + real-time data
- ✅ Production-grade engineering: testing, security, upgrades, performance optimization
Next actions:
- Complete the practice cases and turn knowledge into deployable products
- Join the builder community, share your contracts, and take part in ecosystem building
- Follow official updates; Sui and EVE Frontier are evolving continuously
- Build your own universe; here, code is the laws of physics
"We're not just writing code. We're establishing the laws of physics for a universe."
— EVE Frontier Builder Spirit
📚 Final Reference Resources
- EVE Frontier Official Site
- Official Documentation
- World Contracts Source Code
- Sui Technical Documentation
- Move Book
- Sui ZK Related
- Sui On-chain Randomness
- EVE Frontier Discord
Execution Guide: How to Run Example dApps
100 Sui Core Feature Ideas
# EVE Frontier 2026 Hackathon: Sui Core Feature Creative Library
This catalog features 100 hardcore ideas designed specifically for the EVE Frontier x Sui hackathon. Each idea is deeply integrated with Sui's underlying strengths (such as PTB, Dynamic Fields, Kiosk, and DeepBook) and has its own file for refining the architecture and implementation.
Idea list
- Space Flash Loan - In a single transaction, borrow a ship, mine, haul the ore to another region to sell high, then return the ship with interest; if any step loses money or the ship is destroyed, the whole sequence rewinds…
- Fleet-Synchronized Airdrop Matrix - One PTB bundles the independent jump-request signatures of 100 players, ensuring everyone arrives with zero delay in the same Su…
- Composite Destroy-and-Reforge Factory - Within one PTB, burn 5 different parts scattered across the universe and instantly mint (assemble) them in place…
- One-Click Bail-Out Plan - In the milliseconds before the flagship explodes, a PTB atomically unloads the most valuable radar gear into an escape-pod object and ejects…
- Multi-Sig Joint Bounty Pool - A PTB chains multiple contract checks on the corp leadership's signatures before instantly releasing the alliance vault's supplies to reward the troops…
- Frictionless Toll Sponsorship - A fleet employer pays stargate tolls for its mercenaries while, in the same PTB, the ore those mercenaries just mined is proportionally split back to the employer…
- Instant Seamless Loadout Switcher - Swap an entire fit from "mining-specialized" to "anti-air fortress" with a zero-latency one-click component exchange…
- Cross-Contract Arbitrage Sniper - Using PTB call chaining, buy scrap cheaply at this stargate and, in the same nanosecond, resell it next door as high-priced fuel…
- Auto-Balancing Fuel Network - A PTB loops over the fuel tanks of all 50 ships in a fleet, instantly draining surpluses and topping up ships about to run dry…
- Crit-Stacking Burst Engine - Within a single atomic transaction block, call the turret's fire function 20 times in a row to build a mathematically perfect combo crit charge…
- Unlimited Peripheral Slots - Break the fixed-fitting-slot convention and use dynamic object fields (DOF) to stack unlimited micro-peripheral scanners…
- Encrypted Black-Box Log - Bury sensitive jump and combat coordinate history deep in the ship's sub-field structure via DF; without the retrieval key it cannot even be scanned…
- Nested Wormhole Storage - Since each storage box has a coded capacity limit, use DOF to nest sub-fields layer by layer into an infinitely nested super-warehouse…
- Blood-Evolved Growth Weapon - Based on the barrel's actual kill count, the contract dynamically adds an extra field named veteran_stats to boost its…
- Non-Invasive Corp Livery - Without touching the ship's expensive core defense modules, swap only the link to its external material pack by adding and removing DFs…
- Vicious Space Parasite - The enemy doesn't just attack: it force-injects a chain of virus fields into your ship's underlying DOF that drains your shields every hour and cannot be rem…
- Dynamic Bounty Brand - Not a system mechanic: a permanently glowing red-name dynamic bounty tag stamped on the hull of a universally hated pirate ship…
- Hot-Swap Tax Console - No downtime patching: as ships pass, the stargate extends precise per-hull tax-rate clauses on the fly by adding and removing fields…
- Daily Disposable Passcode Lock - Attach identity sub-fields valid for only half a day to guild members; only then does the great silver stargate open…
- Sacrificial Reactive Armor - When the ship takes an unstoppable lethal shot, the underlying logic actively detaches and shatters its attached defense-field object as an absolute shield…
- Corporate-Intranet Mining Bureau - Only employees whose email ends in "@amazon.com" or "@sui.io" can, via zkL…
- Twitch Fan Pass Giveaway - With zkLogin tied to Twitch SSO, popular streamers can trivially use this protocol to…
- Discord Power Binding - Bind the CEO's top in-game operator lock to a Discord ID; if leadership changes in the server, in-game power transfers seamlessly…
- Traceless Undercover Courier - Skip clunky Web3 interactions: exploit throwaway temp emails to create anonymous, untraceable alts detached from the main account…
- Anti-Bot Email Verification Gate - Bots trying to cross heavily guarded core-belt stargates get a popup requiring a one-time code from their bound real mailbox…
- Geo-Anchored Regional Wars - Use Web2's natural capture of physical location and IP, hard-anchored to the login, to force e.g. North-America-only…
- Foolproof One-Click Inheritance Recovery - Thanks to OAuth and social-account attestation, even a newbie who lost their password can reclaim, in one click, that old…
- Cross-World Achievement Share Box - Through this same-origin interface, if you were a dragon-slayer master in some Ethereum chain game, the system verifies it and grants you, cross-game, a golden…
- Real-World Alarm Trigger - Not in-game firepower at all: when a stargate is breached by an attacker, it calls back to Web2 and dials the real phone number stored behind the account…
- Age-Gated High-Stakes Zone Gateway - Absolutely no minors: this zone involves extreme gore and huge SUI stakes, so entering the gate requires passing government-grade…
- Cosmic Shuffle Wormhole Engine - Forget routes, bet everything and jump in! After pulling the lever blind, half the time you soar into a resource Eden, half the time you drop straight into a star full of fiends…
- Schrödinger's Repair Kit - Pay a fixed, non-negotiable 10 SUI: a 50% chance of restoring the hull to pristine full-combat condition, but also a 50%…
- Wild Ricochet Amplifier Tower - No boring flat +5%: every shot is pure luck; a volley may barely scratch or erupt into a one-hit…
- Wreckage Lottery Salvage Claw - Among thousands of junk wreck fragments hides a tiny jackpot pool; one lucky grab could make you the universe's single luckiest…
- Brutal Alien Gravity Weather System - Each noon an on-chain dice roll dictates whether the day brings universe-wide half-HP debuffs, heavy-gravity slowdowns, and other vicious negative…
- Wandering Treasure Mimic - Truly random big black boxes spawn anywhere; even the treasure creature doesn't know where it will appear next, greatly enrich…
- One-Survivor Duel Roulette Gate - Two alliance leaders with a blood feud lock themselves into a sealed room and leave it entirely to a programmatic draw; three seconds later the door opens and only one…
- Mutating Ore-Belt Reshuffle Zone - Whenever a belt is mined out, the system randomly rerolls the next respawn's purity, yield, and even element mix in wild…
- Bandit Customs Roulette - No posted price: to pass, you spin the big wheel yourself; a small result means free passage, a big one means merciless…
- Adaptive Chameleon Counter Armor - On each hit, a random roll instantly rewrites its resistances to counter the incoming damage type, and may even pop out various extreme…
- Indenture All-Expenses Work Card - Broke new miner? The corp boss covers every shovel stroke, every breath, every bit of throttle, all billed to its bottomless…
- Free-Road Skinners' Gate - The stargate loudly advertises zero tolls to lure traffic, but buried deep in barely findable terms, your authorization is used to siphon…
- Tax-Free Test-Drive Track - You get a top-spec ship, limited not by fuel but by a single wild hour of test-drive time in the current system, during which you blast away and…
- Charity Instant-Revival Insurance - When you are blown up so completely that nothing remains, the public-welfare system, within the few milliseconds you have left, directly…
- Streamer Referral Funnel Chain - To grow their audience, a streamer only needs you to click the signup link and blindly fire two shots; everything is sponsored directly from the streamer's backend…
- Free Slum Relief and Recharge Zone - In the most wretched lowsec corners stands a saintly free aid station built purely by benefactors: completely free and, out of sheer pity, even…
- Fully Hosted Bot Payout Plugin - No more tedious daily clicking; even the compute-friction fees your idle-farming scripts burn are covered by the all-inclusive hosted back…
- All-Inclusive VIP Black Card - Pay the astronomical monthly SUI retainer and, in the designated region, every one of your thousands of laser blasts per second is…
- Total-War Sponsorship Decree - When both sides enter a server-wide decisive battle, the commander activates a sweeping sponsorship decree so the frontline shock troops can recklessly over…
- Alliance Expense and Reimbursement Ledger - A smart welfare system that fully automates reimbursing frontline workers for every wartime procurement, loss, and bit of wear and tear…
- Monopoly Arms Franchise - Top-tier cannons can only be bought from my Kiosk, and anyone who resells one afterward is, regardless of marketplace, forcibly…
- Corp-Only State Commissary - Using Kiosk whitelist conditions, these subsidized supplies sell only to verified wearers of the guild badge; an outsider paying ten…
- Anti-Scalper Transfer Lock - Certain legendary warship NFTs, even when listed in a Kiosk, can never be transferred, completely cutting off all secondary trading and…
- Timed Shadow-Rental Showcase - Put a ship in a Kiosk and "Rent" it out; when the lease ends it is instantly and forcibly reclaimed no matter where in the universe it is…
- Sealed-Bid Darknet Auction House - Using Kiosk features plus encrypted bids, blind-auction ultra-rare stargate construction licenses; the highest bid wins and no one sees the others' cards…
- Enforced Pirate Loot-Split Pact - Pirate crews list stolen ships on a Kiosk; sale proceeds are instantly and exactly split five ways among the 5…
- Stargate Concession Flipping - The gate's management and ownership are listed as a Kiosk exhibit; any whale with enough money can forcibly buy it out at a preset steep premium…
- Showoff-Only Glass Showroom - Not for sale, just for flexing! An aesthete whale buys the server's first art flagship, lists it with no sale price, purely for admiration, viewable only after a like…
- Auto-Depreciating Used-Ship Lot - Every accident, resale, or damage-and-repair cycle automatically slashes the Kiosk listing price according to battle-scar…
- Anonymous Black-Market Deep Net - When listing on a Kiosk, privacy or proxy logic strips the seller's real trading address, fully evading any large…
- Zero-Slippage Fuel Flash-Swap Pool - No slow matching: build a full-chain mineral/fuel order wall on DeepBook so miners can haul up and one-click fill at the best…
- HFT Arbitrage Hauler - Fully automated monitoring of tiny ore price gaps between far-flung DeepBook venues, algorithmically shuttling ships back and forth to grind…
- Wartime Panic Futures Shorter - Holding secret intel that the enemy is about to collapse, place high-leverage short orders on DeepBook before the news leaks and…
- Alliance Treasury Market-Making - For the corp's grand plan, inject the alliance's nation-scale pool of SUI and fuel into a specific DeepBoo…
- Stop-Loss Escape Pod - When the ship is at critical HP and about to be wiped, a preset trigger price automatically connects to the gate's Deep…
- Wartime Liquidation Engine - You mortgaged a stargate to buy a giant gun for show; when the enemy invades your system and land values crash, the DeFi contract…
- Total-Ruin Leverage Contract - Desperate gamblers mortgage their ships at a terrifying 100x leverage on SUI price moves; one wrong call and, on-chain, you lose not only every…
- Cross-Region Alliance Index Fund - Bundle the highest-grade mineral index components into one heavily weighted joint ETF token and list it openly on the ex…
- Black-Swan Ship-Loss Insurance (Credit Default Swap) - If that biggest whale's epic battleship is unexpectedly destroyed, a huge options-derivative payout is instantly triggered…
- Omni-Aggregator Gate DEX - Anyone passing through the gate is not only teleported but matched against every order across the network for the best price, with your junk scrap ore…
- Readable Nav Beacon - Replace unmemorizable sprawling contract and object addresses with a crisp base.alliance.sui, this…
- Server-Wide Most-Wanted Board - Broadcast a blunt, eye-catching, high-bounty domain like kill.boss.sui and everyone instantly…
- Stargate Naming Sponsorship - A deep-pocketed mega-alliance builds stargates and names them after real-world brands like redbull.sui, thereby collect…
- Anti-Impersonation Commander Plate - The supreme commander binds commander-john.sui to stop enemies using fake alts to issue orders in fleet channels…
- Dynasty Handover System - All duchy assets and powers bind directly to king.blood.sui; transferring that domain instantly hands over the entire king…
- Emergency SOS Short Address - About to be blown up with no time to type parameters? Just post help.me.sui in chat to summon the whole army to the defense line…
- Space Joke Dispenser - tell.joke.sui at a remote landing pad auto-broadcasts space one-liners to entertain bored players idling…
- Public Diplomacy Hall of Record - The server-unique diplomacy.guild.sui mercilessly and irremovably displays every shameful treaty-break and backstab…
- Deep-Space Hijack Ambush Net - Hackers tamper with the underlying mapping of a universally trusted landmark domain such as home.sui, instantly redirecting an entire fleet…
- Tree-Structured Corp Directory - Use multi-level suffixes like .admiral.fleet1.alliance.sui to display perfectly the…
- Indelible Galactic War Archive - Tens of gigabytes of full war-chronicle recordings go not to YouTube but are mounted by contract straight into Walrus…
- HD Mech Livery Gallery - Upload unlimited ultra-HD custom ship-armor texture packs to Walrus so the whole game can render them hyper-realistically without bandwidth limits…
- Lost Blueprint Rescue Library - A never-lost archive of ancient ship design blueprints, long forgotten or understood by no one, preserved in Walrus and thus…
- Priced Frontline Intel Market - Ultra-covert HD footage of massing enemy fleets, secretly filmed, is stored on Walrus and locked behind an astronomically priced key…
- Interstellar Radio and Route Bulletin - Turn alliance broadcasts, war briefs, route warnings, and sponsor ads into a subscribable, tippable, archivable deep-space media system…
- Open Fleet AI Strategy Repo - Turn turret policies, logistics schedulers, pricing models, and alliance tactics templates into a licensable, subscribable, revenue-sharing rule-plugin market…
- KillMail Forensics Replay Deck - A public replay platform built around kill records: footage indexing, payout forensics, tactical review, and war teaching…
- Hot-Potato Wanted Beacon - Use the Hot Potato pattern for high-risk manhunt events, fugitive challenges, and timed relay PvP shows…
- Shared Belt Scramble Protocol - Turn an entire ore belt into a shared resource pool where many players concurrently contend over, cooperate on, and interfere with the same high-value resources…
- Permanent War Memorial - Mint alliance victories, expeditions, and great sacrifices into tamper-proof memorial objects, rather than an unreasonable invincible fortress…
- Captain-Only Private Flagship - Design a strongly identity-bound flagship control system that solves authorization, mothballing, and anti-hijacking for high-value ships…
- Flagship Test-Drive and Loaner Bay - Use the Borrow pattern for high-value ship trials, training, event sponsorship, and deposit-backed rentals…
- Alliance Multi-Sig Treasury and War Chest - Turn war funds, taxes, and wartime budgets into a multi-sig-approved, auditable, rug-proof alliance finance system…
- Server-Wide Countdown Contest - On-chain-clock-based public countdowns for resource grabs, war windows, timed auctions, and evacuation settlement…
- Master Blueprint and Royalty Factory - Produce ships, gear, and cosmetics under master-blueprint licensing with ongoing royalties flowing to the original creators…
- Cursed Mutating Relic Weapon - A dangerous relic that keeps mutating with kills, reforging, and sacrifices, growing stronger but possibly turning on its owner…
- Red-Name Auto-Interdiction Net - Around red-name, bounty, and reputation data, link multiple gates and turrets into an automatic co-defense and blockade network…
- Component-Crowdfunded Super Battleship - Let many players subscribe to, assemble, operate, and share the battle losses of individual components, building an alliance-scale super battleship…
- ZK Cross-Chain Identity Mapping - Prove identity, achievements, and reputation from other chains or worlds without exposing full private details…
- Governance Shard Recovery Campaign - Abstract governance rights into safe seasonal event shards that players win through cooperative point capture, escorts, and puzzle-solving…
100 General Comprehensive Ideas
# EVE Frontier 2026 Hackathon: General Idea Library
This catalog contains 100 general hackathon ideas based on the EVE Frontier "A Toolkit for Civilization" theme, covering five areas: practical tools, technical architecture, imaginative creativity, unconventional gameplay, and real-time server integration.
Idea list
- Smart Resource Router - Monitors stargate jump frequency across the region and automatically redirects alliance freighter destinations to nodes whose fuel stock is below 20%, keeping the whole def…
- Automated SOS Beacon - A modified Storage Unit: when its armor (HP) falls below 30%, the smart contract automatically posts on the public channel…
- Space Pawnshop - Players lock server-limited NFT skins or rare blueprints into a smart storage box; the contract issues high-liquidity S… against oracle price feeds
- Guild Attendance Tracker - A special smart gate or turret: when a "Call to Arms" is issued, it automatically records and credits on-chain everyone who…
- Dynamic Toll Stargate - An algorithm prices passage from the past hour's traffic; if large fleets rush a shortcut mid-battle, the toll rises exponentially…
- Bounty Hunter Escrow - A fully anonymous contract: anyone can lock SUI and name a character ID; when the system registers that character's confirmed dea…
- Automated Ammo Factory - Uses Move's Borrow-Use-Return pattern to consume ore automatically, with no human intervention…
- Shared Battery Network - Deploy giant EnergySource components where any ship of any faction can dock and recharge, billed by actual…
- Logistics Queue Manager - Rideshare freight in space: shippers lock deposit and reward, haulers take the job; payment releases only when the specified items actually arrive in a container light-years away…
- Emergency Medical Bay - A lifeline in dangerous regions: sells use-and-burn dynamic NFTs; a dying ship triggers the NFT's burn interaction to instantly…
- Mercenary Firepower Renting - Timestamp-based temporary delegation of a lethal turret's OwnerCap to a miner outside the alliance; after 24 hours the grant…
- Customs & Tax Stargate - A choke-point stargate that charges not per pass but a forced 1% of all SUI in the transiting ship's wallet as a "cus…
- Automated Loot Distributor - Link the battlefield-salvage box to a dedicated split contract; the fleet commander dumps the loot in and the contract instantly divides it equ…
- Dead-Man's Switch Inheritor - A player with no on-chain signature for 30 days is declared "brain-dead"; all their high-level smart components (boxes, gates)…
- Spy Network Paywall - Scouts capture enemy staging coordinates at the edge of hostile systems, encrypt them, and attach them as a Dynamic Field to a spec…
- Alliance Job Board - Building a titan needs massive Veldspar; post an on-chain bounty locking 10,000 SUI that any freelancer…
- Fuel Futures Exchange - Tokenize specific fuel still underground and expected to be mined later (issue futures tokens); players can trade them on the secondary market before the war…
- Border Control API - Absolute defense: the gate checks not only the whitelist but also that the player's address holds a well-known on-chain identity credential (e.g. only allo…
- Automated Ransomware Turret - Locks the enemy ship without destroying it, freezing its engine power with a specific debuff; unless the victim pays within 5 minutes…
- UBI Generator - A pure-charity DAO: deploy perpetual-motion-grade energy arrays that collect surplus energy daily and distribute it evenly on-chain to every player on the server…
- ZK Fleet Movements - Using native zkLogin / ZK proof cryptography, a fleet commander can prove they "legitimately paid the gate toll…
- Cross-Chain Asset Bridge - An extremely hard cross-chain mechanism: Ethereum whales lock their USDC and receive an automatic 1:… inside EVE trade stations
- HFT DeepBook Trading Post - Abandon inefficient AMMs and wire Sui's high-performance central limit order book (DeepBook) source directly into E…
- GraphQL Live Heatmap - A high-performance off-chain aggregator that live-listens to Move contract events such as TurretFired(…
- Multi-Sig Missile Silo - Implement a complex N-of-M multi-signature scheme on Sui: igniting a region-killing super missile requires…
- Optimized Object Registry - Tackle the "loot explosion" problem of mega-battles by rewriting the underlying ObjectRegistry using dynamic fields (Dyn…
- Off-Chain Sig_Verify - Route most high-frequency, non-asset interactions (movement, aiming) through off-chain servers issuing temporary Ed25519 signatures; players…
- Dynamic Field Metadata Engine - Tree-nested object design: append the description of every famous battle the ship has fought as a Dynamic Field…
- Gas-Sponge Dapp-Kit - A big UX win for newcomers: directly integrate the sponsored-transaction code from @evefrontier/dapp-kit:…
- Proximity Proof Protocol - Use strong cryptography (e.g. hash space-time collisions) to prove that two unaffiliated players are, right now, within 3… of each other in the game's underlying engine
- Decentralized CAPTCHA Node - Deployed where idle-mining bots run rampant: the component randomly poses a logic puzzle generated and verified entirely inside the Move VM, with no…
- Move Prover Invariants - Not just business code: ship it with hundreds of lines of formal-verification (Formal Verification) assertions, proving from mathe…
- Batched Jump Router - An elegant TS script that uses transaction batching to pack the individual jump authorizations of 100 corp ships into…
- Upgradable Package Manager - Since mutating immutable assets is very risky, design a pluggable modular gateway; when the gate AI or fee logic needs iterating, you can…
- Cross-Package Combinator - Propose and implement a standard set of function signatures tailored to EVE components (such as `ActionModifier::exec…
- Vault Keeper Ledger Edition - A high-difficulty browser-extension challenge: add a mandatory Ledger… layer to the current zkLogin-based EVE wallet
- State Channel Skirmishes - Two gangs bet everything on a dogfight, but on-chain TPS is too low, so they lock both sides' ship assets and enter an off-chain…
- Automated Security CI/CD - A web test platform for builders: one-click upload of your freshly written EVE defense contract, and the platform will…
- Event-Driven Alert Bot - A bot hovering 24/7 on a WebSocket connection: when a system component emits events like `UnderAtt…
- On-Chain Encrypted Orders - Fleet command orders all in plaintext? Use Diffie-Hellman key exchange with EVE characters' on-chain public keys…
- Evolving Tamagotchi Drones - A fragile electronic-eye NFT living in your cargo hold: feed it specific fuel dregs on time every day, or after 7 days…
- Space Billboard Protocol - If you hold the space rock at the server's busiest crossroads, you can mount on it Display-standard on-chain…
- Sectarian Religion Framework - Worship the nonexistent "Void God" in-game: whales sacrifice (deposit) piles of SUI to their chosen war god; the temple's…
- Bounty Target NFT Cards - If you solo-defeat the acknowledged server champion (e.g. a CEO), at the instant of destruction the system extracts that champion's…
- Intergalactic Radio Station - A simple on-chain text queue: for a tiny fee, append a Spotify link to the tail; anyone tuned in…
- Asteroid Naming Rights - When an unknown mystery region is first fully scanned, the player who mines the last block of raw ore is granted a sacred one-time on-chain right…
- Cosmic Roulette - Hand life and death to true randomness! In a fully isolated gambling gate, two pilots sign a death pact and stake their beloved epic flagships'…
- Cosmetic Modding Pipeline - Like the early Steam Workshop: unleash artistic expression; as long as the 3D model meets the poly-count spec, play…
- Generative Lore Library - Every megastructure jump gate standing in the void was built brick by brick by millions of blue-collar laborers; this component uses…
- Vampiric Weaponry - Not a brute cannon for outright kills: its beam, on hitting enemy armor, deftly siphons a precisely computed tiny fraction…
- Alliance Stock Market - The alliance is no longer just an organization but a registered listed company: all core output (daily total mining flow) is public, transparent, and immut…
- On-Chain Escape Room - A geek-built "puzzle-box chain": a row of ordered storage boxes where each can only be opened by decoding the previous box's…
- Space Lotteries (Tontines) - A game of pure psychological pressure: 100 signatories each stake a large buy-in into the prize pool; as the space…
- Reputation Score Graph - EVE is rife with deception; to catch moles in small teams, build a network that scores only social-interaction dimensions: if you…
- Monument to the Fallen - Neither useful nor economically productive: an abandoned, unpowered, lusterless scrapped mega-gate, yet anyone…
- Diplomatic Marriage Smart Contract - At the most heated moment of a clash between two great alliances, as a show of good faith, both leaders deposit the core permission cards that open all top-secret guild vaults (…
- Planetary Election System - A vast, ore-rich solar system attracts countless workers; to govern the chaos, everyone agrees to burn costly fuel and hold…
- Galactic Treasure Hunt - A server-wide mega easter egg: a developer personally hides a staggering stash of SUI inside a tiny mysterious lock…
- Player-Generated Quests - Ditch the fixed kill-ten-boars-for-two-white-shirts formula; introduce a bounty board defined by idle players themselves, where you can…
- Black Market Contraband - Certain high-energy destructive parts carry strong radiation and are auto-tagged "Illegal" by the server, so in…
- Turret Tourette's - A hyper-neurotic gate defense cannon: its aim and fire boolean ignore all game parameters and are instead wired to an external…
- The "Rickroll" Gateway - A prank component in harmless clothing: 99% of the time it performs a standard warp fold, but if…
- SUI-fueled Space Jukebox - A facility with extreme server-wide broadcast nuisance properties: whether you like it or not, as long as you park in this system, the big speaker storage room…
- The Sentient Bomb - What gets thrown isn't heat-tracking ordnance but a chatterbox bomb crate wired live to GPT-4; as you face it…
- Space Uber for Ships - Long hauls bore the rich, spawning a thoroughly irresponsible distributed idle-ship valet relay app…
- Intergalactic Tinder - Swipe-match in the cold, silent, deadly void; the app only lets you search within a hundred light-years for others who also installed…
- Kamikaze Drone Factory - A satirical, money-burning luxury toy factory: only by genuinely and permanently burning on-chain e.g. Pudgy Penguins…
- The "Karen" AI Turret - As you pass, it demands no toll and fires no laser; its sole check scans your engine's exhaust quality, and only…
- Reverse Toll Booth - Defies all economics, pure benefactor logic: a retired crypto legend funded a bizarre stargate that, whenever you…
- Schrödinger's Cargo Box - A transfer container platform designed mafia-style with deeply opaque sealed functions: you eagerly deposit a priceless…
- Space Potato Relay Racing - Turn the native, indestructible Move "Hot Potato" (a pure logic poison with no drop ability) into…
- Blockchain Fortune Teller - Eastern mysticism ported to a holo-panel by geek means: that derelict station you pass isn't mere decoration; when you touch it…
- The Pacifist Laser - Built purely to torment people by exploiting prank-system loopholes! It fires what is called a beam weapon but is actually masses of junk redundant…
- On-Chain Vandalism Graffiti - This hacker protocol points its spray can at every solemn landmark and famous whale's painstakingly built flagship…
- The Existential Dread Mod - Not a simple plugin: it strikes at spreadsheet-addicted, number-drunk endgame players from a deep psychological angle:…
- Cult of the Floating Default Cube - A floating, utterly void, gray-white-purple-black Default model cube born of a one-off engine texture-rendering glitch…
- Sui-to-Speech Propaganda Network - Anyone passing through the chokepoints and jump-lane nodes ruled and watched by this loud, domineering alliance hears, inside their own cabin, the…
- The Pyramid Scheme Stargate - A financial harvesting net woven from greed and cunning disguise: the gateway marketed as a convenient shortcut was originally meant to…
- Astrological Market Manipulation - Since no one can predict tomorrow's rare-element swings in this cold, data-driven cosmic economy, an eccentric…
- Space Gacha with Real Punishments - The self-destructive extreme of gacha culture: you stake a fortune that could bankrupt you, and…
- Stillness Automated Resupply (SAR) - A band of good Samaritans lurking in every corner of the still-warring test zones: an automated, always-on ghost escort…
- Live Territory Map Integrator - Beyond one-dimensional infowar: a vast, control-room-style ultrawide live situation map of what real players are doing right now…
- Stillness Economy Dashboard - No bombardment here, only the rawest, bloodiest records of capital flow and value transfer; the developers successfully…
- Automated Toll Collector Fleet - A fearsome crew of enforcers formed from 24/7 scripted bots arrayed at key junctions; unlike the easily dodged preset…
- Live Event Ticketing - No joke: in this universe, with its large, highly engaged test population, something like… is genuinely being held, with real-time ticket validation
- Stillness Distress Signal Network - An absolute necessity for miners hauling supplies through kill-prone null zones: a quick-dispatch module wired to a dedicated Discord rescue hotline…
- On-Chain Killboard for Stillness - A screen-filling, blood-red, server-obsessed, authoritative, and unfakeable live assassin leaderboard…
- Live Alliance Treason Monitor - To guard against highly placed insiders being bought out by rivals and running off with the treasury, or dragging the whole guild down…
- Stillness Dynamic Supply/Demand Tracker - No sentiment here, only ruthless margins: a dense high-tech dashboard refreshing at extreme frequency across the whole map…
- Live Streamer Bounty Overlay - A module that perfectly blurs the line between off-screen audience and in-game participant, breaking the fourth wall: if the…
- Continuous Integration Deployer - The write-it-and-ship-it, hot-zone CI/CD cannon-delivery platform that frontline coder-merchants and defense experts desperately need…
- Stillness Diplomatic Incident Logger - No favoritism, no PR spin, no mud-slinging chaos of mutual blame: this ledger terminal is ice-cold and absolutely…
- Automated Live Mining Fleet Manager - Ditch the lazy, error-prone human command post that misses prize ore during bathroom breaks; this…
- Live Server Status Oracle - The game is brutal and still scaffolding-grade, prone to crashes, disconnects, and server meltdowns; this no-questions-asked disaster insurance…
- The Stillness "Stock Exchange" - You think trading is still two ships nervously swapping small sums in a corner like dealers dodging scams? This…
- In-game Real Estate Brokerage - Don't assume a boundless, drifting vacuum universe where even a patch of soil is a luxury has no real est…
- Live Contraband Scanner Hub - In this era of extreme freedom with no morals, rules, or bottom lines, an indiscriminate dragnet checkpoint cannon gate that profiles every passerby…
- Stillness Rescue DAO - Unlike the vicious toll cannons above: a charity DAO base radiating pure kindness, delivering hot meals and rapid rescue to stranded survivors…
- Live Market Maker Bot (MMB) - Don't naively assume this boundless, utterly unregulated market lacks… a ruthless, never-sleeping monopolist market-making harvester bot
- The "Stillness Memorial" - In a universe where horrific carnage can erupt at any moment and scheming for profit never stops, a vast, desolate, eerily silent memorial valley: an enormous, lifeless graveyard leaderboard with epitaphs, a forbidden relic no one dares set foot in…