GitBook: [2.0.0] 42 pages and 6 assets modified

This commit is contained in:
boneyard93501 2021-07-05 00:18:51 +00:00 committed by gitbook-bot
parent a8219a87b2
commit 2066a70608
No known key found for this signature in database
GPG Key ID: 07D2180C7B12D0FF
16 changed files with 802 additions and 29 deletions

View File

@@ -4,35 +4,18 @@
* [Thinking In Aquamarine](p2p.md)
* [Concepts](concepts.md)
* [Quick Start](quick-start.md)
* [Quick Start](quick_start/README.md)
* [Setup](quick_start/quick_start_setup.md)
* [Using a Service](quick_start/quick_start_using_a_service.md)
* [Building An Application From Multiple Services](quick_start/quick_start_building_from_multiple_services.md)
* [Adding A Storage Service](quick_start/quick_start_add_persistence/README.md)
* [Setting Up](quick_start/quick_start_add_persistence/quick_start_persistence_setup.md)
* [CRUD All the Way](quick_start/quick_start_add_persistence/quick_start_persistence_crud.md)
* [What's Next](quick_start/quick_start_summary.md)
* [Developing Modules And Services](development_development/README.md)
* [Overview](development_development/development_overview.md)
* [From Module To Service](development_development/developmet_build_modules.md)
* [Building The Reward Block Application](development_development/development_reward_block_app/README.md)
* [Ethereum Request Service](development_development/development_reward_block_app/development_eth_calls.md)
* [SQLite Service](development_development/development_reward_block_app/development_sqlite.md)
* [Blocks To Database](development_development/development_reward_block_app/development_persisting_blocks.md)
* [Additional Concepts](development_development/development_reward_block_app/development_additional_concepts.md)
* [Summary](development_development/summary.md)
* [Aquamarine](knowledge_aquamarine/README.md)
* [Aqua](knowledge_aquamarine/hll/README.md)
* [Aqua VM](knowledge_aquamarine/hll/vm.md)
* [AIR](knowledge_aquamarine/hll/knowledge_aquamarine_air.md)
* [Marine](knowledge_aquamarine/marine/README.md)
* [Marine CLI](knowledge_aquamarine/marine/marine-cli.md)
* [Marine Repl](knowledge_aquamarine/marine/marine-repl.md)
* [Marine Rust SDK](knowledge_aquamarine/marine/marine-rs-sdk.md)
* [Tools](knowledge_tools.md)
* [Knowledgebase](knowledge_knowledge/README.md)
* [Overview](knowledge_knowledge/knowledge_overview.md)
* [Concepts](knowledge_knowledge/knowledge_concepts.md)
* [Tools](knowledge_knowledge/knowledge_tools.md)
* [Aquamarine](knowledge_knowledge/knowledge_aquamarine/README.md)
* [Aqua](knowledge_knowledge/knowledge_aquamarine/hll/README.md)
* [Aqua VM](knowledge_knowledge/knowledge_aquamarine/hll/vm.md)
* [AIR](knowledge_knowledge/knowledge_aquamarine/hll/knowledge_aquamarine_air.md)
* [Marine](knowledge_knowledge/knowledge_aquamarine/marine/README.md)
* [Marine CLI](knowledge_knowledge/knowledge_aquamarine/marine/marine-cli.md)
* [Marine Repl](knowledge_knowledge/knowledge_aquamarine/marine/marine-repl.md)
* [Marine Rust SDK](knowledge_knowledge/knowledge_aquamarine/marine/marine-rs-sdk.md)
* [Node](knowledge_knowledge/node/README.md)
* [Overview](knowledge_knowledge/node/overview.md)
* [Services](knowledge_knowledge/node/knowledge_node_services.md)

View File

@@ -127,6 +127,10 @@ The Fluence protocol offers an alternative to node selection, i.e. connection and
[TrustGraph](https://github.com/fluencelabs/trust-graph) is currently under active development. Please check the repo for progress.
{% endhint %}
### Application
An application is the "frontend" to one or more services and their execution sequence. Applications are developed by coordinating one or more services into a logical compute unit and tend to live outside the Fluence network, e.g., the browser as a peer-client. They can be executed in various runtime environments ranging from browsers to backend daemons.
### **Scaling Applications**
As discussed previously, decoupling at the network and business logic levels is at the core of the Fluence protocol and provides the major entry points for scaling solutions.

View File

@@ -0,0 +1,22 @@
# Aquamarine
Aquamarine is a programming language and executable choreography tool for distributed applications and backends. Aquamarine manages the communication and coordination between services, devices, and APIs without introducing any centralized gateway and can be used to express various distributed systems: from simple request-response to comprehensive network consensus algorithms.
At the core of Aquamarine is the design ideal of pairing concurrent systems, and especially decentralized networks, with a programming and execution tool chain that avoids the centralized bottlenecks commonly introduced by [workflow engines](https://en.wikipedia.org/wiki/Workflow_engine) and [business rule engines](https://en.wikipedia.org/wiki/Business_rules_engine). This not only makes Aquamarine the Rosetta Stone of the Fluence solution but also a very powerful, generic coordination and composition medium.
## Background
When we build systems, we need to be able to model, specify, analyze, and verify them, and this is especially important for concurrent systems such as parallel and multi-threaded systems. [Formal specification](https://en.wikipedia.org/wiki/Formal_specification) denotes a family of formal approaches to design, model, and verify systems. In the context of concurrent systems, two distinct formal specification techniques are available. The state-oriented approach is concerned with modeling and verifying a system's state and state transitions and is often accomplished with [TLA+](https://en.wikipedia.org/wiki/TLA%2B). Modern blockchain design, modeling, and verification tend to rely on a state-based specification.
An alternative, complementary approach is based on [process calculus](https://en.wikipedia.org/wiki/Process_calculus) to model and verify the sequence of communication operations of a system at any given time. [π-calculus](https://en.wikipedia.org/wiki/%CE%A0-calculus) is a modern process calculus employed in a wide range of applications ranging from biology to games and business processes.
Aquamarine, Fluence's distributed composition language and runtime, is based on π-calculus and provides a solid theoretical basis toward the design, modeling, implementation, and verification of a wide class of distributed, peer-to-peer networks, applications and backends.
## Language
[Aquamarine Intermediate Representation](https://github.com/boneyard93501/docs/tree/a512080f81137fb575a5b96d3f3e83fa3044fd1c/src/knowledge-base/knowledge_aquamarine__air.md) \(AIR\) is a low-level language modeled after the [WebAssembly text format](https://developer.mozilla.org/en-US/docs/WebAssembly/Understanding_the_text_format) and allows developers to manage network peers as well as services and backends. AIR, while intended as a compile target, is currently the only Aquamarine language implementation although a high level language \(HLL\) is currently under active development.
## Runtime
The Aquamarine runtime is a virtual machine executed by the [Fluence Compute Engine](https://github.com/boneyard93501/docs/tree/a512080f81137fb575a5b96d3f3e83fa3044fd1c/src/knowledge-base/knowledge_fce.md) \(FCE\), which runs not only on every Fluence network peer but also on every frontend client. Having the runtime available on each node aids decentralized service discovery and execution at the same level of decentralization as the network itself, which is of significant importance. Moreover, since execution scripts run on both the client and the \(remote\) nodes, a high degree of auditability and verifiability can be attained.

View File

@@ -0,0 +1,6 @@
# Aqua
## Aquamarine High Level Language
_**Stay Tuned -- Coming Soon To A Repo Near You**_

View File

@@ -0,0 +1,55 @@
# AIR
The Aquamarine Intermediate Representation \(AIR\) is a low-level language to program both distributed networks and the services deployed on them. The language comprises a small number of instructions:
* _**call**_: execution
* _**seq**_: sequential
* _**par**_: parallel
* _**fold**_: iteration
* _**xor**_: branching & error handling
* _**null**_: empty instruction
which operate on _peer-id_ \(location\), _service-id_, and _service method_ over an argument list; see Figure 1.
**Figure 1: AIR Instruction Definition** ![Execution](../../.gitbook/assets/air_call_execution_1.png)
## Instructions
AIR instructions are intended to launch the execution of a service method as follows:
1. The method is executed on the peer specified by the peer id \(location\) parameter
2. The peer is expected to have the Wasm service specified by the service id parameter
3. The service must have a callable method specified by the method parameter
4. The arguments specified by the argument list are passed to the method
5. The result of the method is returned under the specified output name
**Figure 2: Sequential Instruction** ![Execution](../../.gitbook/assets/air_sequential_2%20%281%29%20%281%29%20%281%29%20%281%29%20%281%29%20%282%29%20%283%29%20%284%29%20%284%29%20%284%29%20%281%29.png)
The _**seq**_ instruction takes two instructions at most as its arguments and executes them sequentially, one after the other.
**Figure 3: Parallel Instruction** ![Execution](../../.gitbook/assets/air_par_3.png)
The _**par**_ instruction takes at most two instructions as its arguments. Particles may execute on parallel paths if and only if each referenced service is hosted on a different node; otherwise, the particles execute sequentially.
TODO: add better graphic showing the distinction of branching vs seq.
**Figure 4: Fold Instruction** ![Execution](https://github.com/fluencelabs/gitbook-docs/tree/84e814d02d9299034c9c031adf7f081bb59898b9/.gitbook/assets/air_fold_4%20%281%29%20%282%29%20%281%29.png)
The _**fold**_ instruction iterates over the elements of an array and works as follows:
* _**fold**_ instruction takes three arguments: an array, a variable and an instruction
* At each iteration, the variable is assigned an element of the array and the argument-instruction is executed
* The argument-instruction can access the variable and uses the next statement to trigger the next iteration
**Figure 5: Branching Instruction** ![Execution](../../.gitbook/assets/air_xor_5.png)
This instruction is intended for organizing branches in the flow of execution as well as for handling errors:
* The _**XOR**_ instruction takes two instructions as its arguments
* The first instruction is executed and if the execution is successful, then the second instruction is ignored
* If the first instruction fails, then the second one is executed.
**Figure 6: Null Instruction** ![Execution](https://github.com/fluencelabs/gitbook-docs/tree/84e814d02d9299034c9c031adf7f081bb59898b9/.gitbook/assets/air_null_6%20%281%29%20%282%29.png)
This is an empty instruction: it takes no arguments and does nothing. The _**null**_ instruction is useful for generating code.

View File

@@ -0,0 +1,2 @@
# Aqua VM

View File

@@ -0,0 +1,16 @@
# Marine
[Marine](https://github.com/fluencelabs/marine) is a general purpose WebAssembly runtime favoring Wasm modules based on the [ECS](https://en.wikipedia.org/wiki/Entity_component_system) pattern or plugin architecture and uses Wasm [Interface Types](https://github.com/WebAssembly/interface-types/blob/master/proposals/interface-types/Explainer.md) \(IT\) to implement a [shared-nothing](https://en.wikipedia.org/wiki/Shared-nothing_architecture) linking scheme. Fluence [nodes](https://github.com/fluencelabs/fluence) use Marine to host the Aqua VM and execute hosted Wasm services.
TODO: we could really do with a diagram here.
The [Marine Rust SDK](https://github.com/fluencelabs/marine-rs-sdk) allows developers to hide the IT implementation details behind the handy `[marine]` procedural macro and provides the scaffolding for unit tests.
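As a rough sketch of what a Marine module looks like in practice \(the function mirrors the greeting example discussed in the Marine Rust SDK section; the layout here is purely illustrative\):
```rust
use fluence::marine;
use fluence::module_manifest;

module_manifest!();

pub fn main() {}

// `[marine]` generates the Interface Types glue so the function
// can be called from other modules or from AIR scripts
#[marine]
pub fn greeting(name: String) -> String {
    format!("Hi, {}", name)
}
```
Compiled with `marine build`, such a module can be loaded into the REPL or deployed to a node, as described in the following sections.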

View File

@@ -0,0 +1,35 @@
# Marine CLI
The [Marine command line tool](https://github.com/fluencelabs/marine) provides the project `marine build` functionality, analogous to `cargo build`, which compiles the Rust code to _wasm32-wasi_ modules. In addition, `marine` provides utilities to inspect Wasm modules, expose Wasm module attributes, or manually set module properties.
```text
mbp16~(:|✔) % marine --help
Fluence Marine command line tool 0.6.7
Fluence Labs
USAGE:
marine [SUBCOMMAND]
FLAGS:
-h, --help Prints help information
-V, --version Prints version information
SUBCOMMANDS:
aqua Shows data types of provided module in a format suitable for Aqua
build Builds provided Rust project to Wasm
help Prints this message or the help of the given subcommand(s)
info Shows manifest and sdk version of the provided Wasm file
it Shows IT of the provided Wasm file
repl Starts Fluence application service REPL
set Sets interface types and version to the provided Wasm file
mbp16~(:|✔) %
```

View File

@@ -0,0 +1,35 @@
# Marine Repl
[`mrepl`](https://crates.io/crates/mrepl) is a command line tool to locally run a Marine instance in order to inspect, run, and test Wasm modules and service configurations. We can run the REPL with either `mrepl` or `marine repl`:
```text
mbp16~(:|✔) % mrepl
Welcome to the Marine REPL (version 0.7.2)
Minimal supported versions
sdk: 0.6.0
interface-types: 0.20.0
New version is available! 0.7.2 -> 0.7.4
To update run: cargo +nightly install mrepl --force
app service was created with service id = d81a4de5-55c3-4cb7-935c-3d5c6851320d
elapsed time 486.234µs
1> help
Commands:
n/new [config_path] create a new service (current will be removed)
l/load <module_name> <module_path> load a new Wasm module
u/unload <module_name> unload a Wasm module
c/call <module_name> <func_name> [args] call function with given name from given module
i/interface print public interface of all loaded modules
e/envs <module_name> print environment variables of a module
f/fs <module_name> print filesystem state of a module
h/help print this message
q/quit/Ctrl-C exit
2>
```

View File

@@ -0,0 +1,537 @@
# Marine Rust SDK
The [marine-rs-sdk](https://github.com/fluencelabs/marine-rs-sdk) empowers developers to write services suitable for peer hosting in peer-to-peer networks using the Marine Virtual Machine by enabling the wasm32-wasi compile target for Marine. For an introduction to writing services with the marine-rs-sdk, see the [Developing Modules And Services]() section.
### API
The procedural macros `[marine]` and `[marine_test]` are the two primary features provided by the SDK. The `[marine]` macro can be applied to a function, an `extern` block, or a structure. The `[marine_test]` macro, on the other hand, allows the use of the familiar `cargo test` to execute tests over the actual Wasm module generated from the service code.
#### Function Export
Applying the `[marine]` macro to a function results in its export, which means that it can be called from other modules or AIR scripts. For the function to be compatible with this macro, its arguments must be of the `ftype`, which is defined as follows:
`ftype` = `bool`, `u8`, `u16`, `u32`, `u64`, `i8`, `i16`, `i32`, `i64`, `f32`, `f64`, `String`
`ftype` = `ftype` \| `Vec`&lt;`ftype`&gt;
`ftype` = `ftype` \| `Record`&lt;`ftype`&gt;
In other words, the arguments must be one of the types listed below:
* one of the following Rust basic types: `bool`, `u8`, `u16`, `u32`, `u64`, `i8`, `i16`, `i32`, `i64`, `f32`, `f64`, `String`
* a vector of elements of the above types
* a vector composed of vectors of the above type, where recursion is acceptable, e.g. the type `Vec<Vec<Vec<u8>>>` is permissible
* a record, where all fields are of the basic Rust types
* a record, where all fields are of any above types or other records
The return type of a function must follow the same rules, but currently only one return type is possible.
See the example below of an exposed function with a complex type signature and return value:
```rust
// export TestRecord as a public data structure bound by
// the IT type constraints
#[marine]
pub struct TestRecord {
    pub field_0: i32,
    pub field_1: Vec<Vec<u8>>,
}

// export foo as a public function bound by the
// IT type constraints
#[marine]
pub fn foo(arg_1: Vec<Vec<Vec<Vec<TestRecord>>>>, arg_2: String) -> Vec<Vec<Vec<Vec<TestRecord>>>> {
    unimplemented!()
}
```
{% hint style="info" %}
Function Export Requirements
* wrap a target function with the `[marine]` macro
* function arguments must be of `ftype`
* the function return type also must be of `ftype`
{% endhint %}
#### Function Import
The `[marine]` macro can also wrap an [`extern` block](https://doc.rust-lang.org/std/keyword.extern.html). In this case, all functions declared in it are considered imported functions. If there are imported functions in some module, say, module A, then:
* There should be another module, module B, that exports the same functions. The name of module B is indicated in the `link` attribute \(see examples below\).
* Module B must be loaded into `Marine` by the time the loading of module A starts. Module A cannot be loaded if at least one of its imported functions is absent from `Marine`.
See the examples below for wrapped `extern` block usage:
{% tabs %}
{% tab title="Example 1" %}
```rust
#[marine]
pub struct TestRecord {
pub field_0: i32,
pub field_1: Vec<Vec<u8>>,
}
// wrap the extern block with the marine macro to expose the function
// as an import to the Marine VM
#[marine]
#[link(wasm_import_module = "some_module")]
extern "C" {
pub fn foo(arg: Vec<Vec<Vec<Vec<TestRecord>>>>, arg_2: String) -> Vec<Vec<Vec<Vec<TestRecord>>>>;
}
```
{% endtab %}
{% tab title="Example 2" %}
```rust
#[marine]
#[link(wasm_import_module = "some_module")]
extern "C" {
pub fn foo(arg: Vec<Vec<Vec<Vec<u8>>>>) -> Vec<Vec<Vec<Vec<u8>>>>;
}
```
{% endtab %}
{% endtabs %}
{% hint style="info" %}
#### Function import requirements
* wrap an extern block with the function\(s\) to be imported with the `[marine]` macro
* all function\(s\) arguments must be of the `ftype` type
* the return type of the function\(s\) must be `ftype`
{% endhint %}
#### Structures
Finally, the `[marine]` macro can wrap a `struct`, making it possible to use it as a function argument or return type. Note that:
* only macro-wrapped structures can be used as function arguments and return types
* all fields of the wrapped structure must be public and of the `ftype`.
* it is possible to have inner records in the macro-wrapped structure and to import wrapped structs from other crates
See the examples below for wrapping a `struct`:
{% tabs %}
{% tab title="Example 1" %}
```rust
#[marine]
pub struct TestRecord0 {
pub field_0: i32,
}
#[marine]
pub struct TestRecord1 {
pub field_0: i32,
pub field_1: String,
pub field_2: Vec<u8>,
pub test_record_0: TestRecord0,
}
#[marine]
pub struct TestRecord2 {
pub test_record_0: TestRecord0,
pub test_record_1: TestRecord1,
}
#[marine]
fn foo(mut test_record: TestRecord2) -> TestRecord2 { unimplemented!(); }
```
{% endtab %}
{% tab title="Example 2" %}
```rust
#[marine]
pub struct TestRecord0 {
    pub field_0: i32,
}

#[marine]
pub struct TestRecord1 {
    pub field_0: i32,
    pub field_1: String,
    pub field_2: Vec<u8>,
    pub test_record_0: TestRecord0,
}

#[marine]
pub struct TestRecord2 {
    pub test_record_0: TestRecord0,
    pub test_record_1: TestRecord1,
}

#[marine]
#[link(wasm_import_module = "some_module")]
extern "C" {
    fn foo(test_record: TestRecord2) -> TestRecord2;
}
```
{% endtab %}
{% tab title="Example 3" %}
```rust
mod data_crate {
use fluence::marine;
#[marine]
pub struct Data {
pub name: String,
pub data: f64,
}
}
use data_crate::Data;
use fluence::marine;
fn main() {}
#[marine]
fn some_function() -> Data {
Data {
name: "example".into(),
data: 1.0,
}
}
```
{% endtab %}
{% endtabs %}
{% hint style="info" %}
#### Structure passing requirements
* wrap a structure with the `[marine]` macro
* all structure fields must be of the `ftype`
* the structure must be referenced in a function signature without a preceding path, i.e., `StructureName` and not `package_name::module_name::StructureName`
* wrapped structs can be imported from crates
{% endhint %}
#### Call Parameters
There is a special API function `fluence::get_call_parameters()` that returns an instance of the [`CallParameters`](https://github.com/fluencelabs/marine-rs-sdk/blob/master/fluence/src/call_parameters.rs#L35) structure defined as follows:
```rust
pub struct CallParameters {
    /// Peer id of the AIR script initiator.
    pub init_peer_id: String,
    /// Id of the current service.
    pub service_id: String,
    /// Id of the service creator.
    pub service_creator_peer_id: String,
    /// Id of the host which runs this service.
    pub host_id: String,
    /// Id of the particle whose execution resulted in a call to this service.
    pub particle_id: String,
    /// Security tetraplets which describe the origin of the arguments.
    pub tetraplets: Vec<Vec<SecurityTetraplet>>,
}
```
CallParameters are especially useful in constructing authentication services:
```rust
// auth.rs
use fluence::{marine, CallParameters};

pub fn is_owner() -> bool {
    let meta: CallParameters = fluence::get_call_parameters();
    let caller = meta.init_peer_id;
    let owner = meta.service_creator_peer_id;

    caller == owner
}

#[marine]
pub fn am_i_owner() -> bool {
    is_owner()
}
```
#### MountedBinaryResult
Due to the inherent limitations of Wasm modules, such as the lack of sockets, it may be necessary for a module to interact with its host to bridge such gaps, e.g., to use an https transport provider like _curl_. In order for a Wasm module to use a host's _curl_ capabilities, we need to provide access to the binary, which at the code level is achieved through the Rust `extern` block:
```rust
// Importing a linked binary, curl, to a Wasm module
#![allow(improper_ctypes)]
use fluence::marine;
use fluence::module_manifest;
use fluence::MountedBinaryResult;
module_manifest!();
pub fn main() {}
#[marine]
pub fn curl_request(curl_cmd: Vec<String>) -> MountedBinaryResult {
let response = curl(curl_cmd);
response
}
#[marine]
#[link(wasm_import_module = "host")]
extern "C" {
fn curl(cmd: Vec<String>) -> MountedBinaryResult;
}
```
The above code creates a "curl adapter", i.e., a Wasm module that allows other Wasm modules to use the `curl_request` function, which calls the imported _curl_ binary in this case, to make http calls. Please note that we are wrapping the `extern` block with the `[marine]` macro and introduce a Marine-native data structure [`MountedBinaryResult`](https://github.com/fluencelabs/marine/blob/master/examples/url-downloader/curl_adapter/src/main.rs) as the linked-function return value.
Please note that if you want to use `curl_request` with testing \(see below\), the curl call needs to be marked unsafe, e.g.:
```rust
let response = unsafe { curl(curl_cmd) };
```
since cargo does not have access to the machinery the marine-rs-sdk puts in place to handle `unsafe`.
MountedBinaryResult itself is a Marine-compatible struct containing the binary's process return code, an error string, and stdout and stderr as byte arrays:
```rust
#[marine]
#[derive(Clone, PartialEq, Default, Eq, Debug, Serialize, Deserialize)]
pub struct MountedBinaryResult {
/// Return process exit code or host execution error code, where SUCCESS_CODE means success.
pub ret_code: i32,
/// Contains the string representation of an error, if ret_code != SUCCESS_CODE.
pub error: String,
/// The data that the process wrote to stdout.
pub stdout: Vec<u8>,
/// The data that the process wrote to stderr.
pub stderr: Vec<u8>,
}
```
MountedBinaryResult can then be used in a variety of match or conditional tests.
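For instance \(a sketch building on the curl adapter above; the function name `curl_text` is made up, and treating `ret_code == 0` as the success code is an assumption\), a caller might branch on the return code like this:
```rust
#![allow(improper_ctypes)]
use fluence::marine;
use fluence::MountedBinaryResult;

#[marine]
pub fn curl_text(cmd: Vec<String>) -> String {
    // call the host-mounted binary; marked unsafe as discussed above
    let result: MountedBinaryResult = unsafe { curl(cmd) };

    // branch on the outcome reported by the mounted binary;
    // 0 is assumed to be the success code here
    if result.ret_code == 0 {
        String::from_utf8(result.stdout).unwrap_or_default()
    } else {
        format!("curl failed: {}", result.error)
    }
}

#[marine]
#[link(wasm_import_module = "host")]
extern "C" {
    fn curl(cmd: Vec<String>) -> MountedBinaryResult;
}
```
The `stdout` conversion to a string is shown for convenience; the raw byte arrays can of course be passed on as-is.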
#### Testing
Since we are compiling to a wasm32-wasi target with `ftype` constraints, the basic `cargo test` is not all that useful or even usable for our purposes. To alleviate that limitation, Fluence has introduced the [`[marine_test]` macro](https://github.com/fluencelabs/marine-rs-sdk/tree/master/crates/marine-test-macro), which does a lot of the heavy lifting to allow developers to use `cargo test` as intended. That is, the `[marine_test]` macro generates the necessary code to call Marine, one instance per test function, based on the Wasm module and associated configuration file, so that the actual test function is run against the Wasm module, not the native code.
Let's have a look at an implementation example:
```rust
use fluence::marine;
use fluence::module_manifest;

module_manifest!();

pub fn main() {}

#[marine]
pub fn greeting(name: String) -> String {              // 1
    format!("Hi, {}", name)
}

#[cfg(test)]
mod tests {
    use fluence_test::marine_test;                     // 2

    #[marine_test(config_path = "../Config.toml", modules_dir = "../artifacts")] // 3
    fn empty_string() {
        let actual = greeting.greeting(String::new()); // 4
        assert_eq!(actual, "Hi, ");
    }

    #[marine_test(config_path = "../Config.toml", modules_dir = "../artifacts")]
    fn non_empty_string() {
        let actual = greeting.greeting("name".to_string());
        assert_eq!(actual, "Hi, name");
    }
}
```
1. We wrap a basic _greeting_ function with the `[marine]` macro, which results in the greeting.wasm module
2. We wrap our tests as usual with `[cfg(test)]` and import the _fluence_test_ crate. Do **not** import _super_ or the _local crate_.
3. Instead, we apply the `[marine_test]` macro to each of the test functions by providing the path to the config file, e.g., Config.toml, and the directory containing the Wasm module we obtained after compiling our project with `marine build`. It is imperative that project compilation precedes the test run; otherwise the required Wasm file won't be there.
4. The target of our tests is the `pub fn greeting` function. Since we are calling the function from the Wasm module, we must prefix the function name with the module namespace -- `greeting` in this example.
Now that we have our Wasm module and tests in place, we can proceed with `cargo test --release`. Note that using the `release` profile vastly improves the import speed of the necessary Wasm modules.
### Features
The SDK has two useful features: `logger` and `debug`.
#### Logger
Using logging is a simple way to assist in debugging without deploying the module\(s\) to a peer-to-peer network node. The `logger` feature allows you to use a special logger built on top of the [log](https://crates.io/crates/log) crate.
To enable logging, please specify the `logger` feature of the Fluence SDK in `Cargo.toml` and add the [log](https://docs.rs/log/0.4.11/log/) crate:
```toml
[dependencies]
log = "0.4.14"
fluence = { version = "0.6.9", features = ["logger"] }
```
The logger should be initialized before its usage. This can be done in the `main` function as shown in the example below.
```rust
use fluence::marine;
use fluence::WasmLogger;
pub fn main() {
WasmLogger::new()
// with_log_level can be skipped,
// logger will be initialized with Info level in this case.
.with_log_level(log::Level::Info)
.build()
.unwrap();
}
#[marine]
pub fn put(name: String, file_content: Vec<u8>) -> String {
log::info!("put called with file name {}", file_name);
unimplemented!()
}
```
In addition to the standard log creation features, the Fluence logger allows the so-called target map to be configured during the initialization step. This allows you to filter out logs by `logging_mask`, which can be set for each module in the service configuration. Let's consider an example:
```rust
const TARGET_MAP: [(&str, i64); 4] = [
("instruction", 1 << 1),
("data_cache", 1 << 2),
("next_peer_pks", 1 << 3),
("subtree_complete", 1 << 4),
];
pub fn main() {
use std::collections::HashMap;
use std::iter::FromIterator;
let target_map = HashMap::from_iter(TARGET_MAP.iter().cloned());
fluence::WasmLogger::new()
.with_target_map(target_map)
.build()
.unwrap();
}
#[marine]
pub fn foo() {
log::info!(target: "instruction", "this will print if (logging_mask & 1) != 0");
log::info!(target: "data_cache", "this will print if (logging_mask & 2) != 0");
}
```
Here, an array called `TARGET_MAP` is defined and provided to a logger in the `main` function of a module. Each entry of this array contains a string \(a target\) and a number that represents the bit position in the 64-bit mask `logging_mask`. When you write a log message request `log::info!`, its target must coincide with one of the strings \(the targets\) defined in the `TARGET_MAP` array. The log will be printed if `logging_mask` for the module has the corresponding target bit set.
{% hint style="info" %}
The REPL also uses the log crate to print logs from Wasm modules. Log messages will be printed if the `RUST_LOG` environment variable is specified.
{% endhint %}
#### Debug
The application of the second feature is limited to obtaining some of the internal details of the IT execution. Normally, this feature should not be used by a backend developer. Here you can see an example of such details for the greeting service compiled with the `debug` feature:
```bash
# running the greeting service compiled with debug feature
~ $ RUST_LOG="info" fce-repl Config.toml
Welcome to the Fluence FaaS REPL
app service's created with service id = e5cfa463-ff50-4996-98d8-4eced5ac5bb9
elapsed time 40.694769ms
1> call greeting greeting "user"
[greeting] sdk.allocate: 4
[greeting] sdk.set_result_ptr: 1114240
[greeting] sdk.set_result_size: 8
[greeting] sdk.get_result_ptr, returns 1114240
[greeting] sdk.get_result_size, returns 8
[greeting] sdk.get_result_ptr, returns 1114240
[greeting] sdk.get_result_size, returns 8
[greeting] sdk.deallocate: 0x110080 8
result: String("Hi, user")
elapsed time: 222.675µs
```
The most important information in these logs relates to the `allocate`/`deallocate` function calls. The `sdk.allocate: 4` line corresponds to passing the 4-byte `user` string to the Wasm module; the memory is allocated inside the module and the string is copied there. Conversely, `sdk.deallocate: 0x110080 8` refers to passing the 8-byte resulting string `Hi, user` to the host side. Since all arguments and results are passed by value, `deallocate` is called to delete unnecessary memory inside the Wasm module.
#### Module Manifest
The `module_manifest!` macro embeds the Interface Types \(IT\), SDK, and Rust project versions, as well as additional project and build information, into the Wasm module. For the macro to be usable, it needs to be imported and initialized in the _main.rs_ file:
```rust
// main.rs
use fluence::marine;
use fluence::module_manifest; // import manifest macro

module_manifest!(); // initialize macro

fn main() {}

#[marine]
fn some_function() {}
```
Using the Marine CLI, we can inspect a module's manifest with `marine info`:
```text
mbp16~/localdev/struct-exp(main|…) % marine info -i artifacts/*.wasm
it version: 0.20.1
sdk version: 0.6.0
authors: The Fluence Team
version: 0.1.0
description: foo-wasm, a Marine wasi module
repository:
build time: 2021-06-11 21:08:59.855352 +00:00 UTC
```

View File

@@ -12,7 +12,7 @@ Each Fluence peer is equipped with a set of "built-in" services that can be call
6. _op_ basic operations on data
7. _deprecated_ - namespace for deprecated API
Below is the reference documentation for all the existing built-in services. Please refer to the JS SDK documentation to learn how to easily use them from the JS SDK.
Please note that the [`fldist`](../knowledge_tools.md#fluence-proto-distributor-fldist) CLI tool, as well as the [JS SDK](../knowledge_tools.md#fluence-js-sdk), provide access to node-based services.
Please note that the [`fldist`](../../knowledge_tools.md#fluence-proto-distributor-fldist) CLI tool, as well as the [JS SDK](../../knowledge_tools.md#fluence-js-sdk), provide access to node-based services.
## API

knowledge_tools.md (new file, 78 lines)
View File

@@ -0,0 +1,78 @@
# Tools
## Fluence Marine REPL
[`mrepl`](https://crates.io/crates/mrepl) is a command line tool \(CLI\) to locally run a Marine instance to inspect, run, and test module and service configurations.
```text
mbp16~(:|✔) % mrepl
Welcome to the Marine REPL (version 0.7.2)
Minimal supported versions
sdk: 0.6.0
interface-types: 0.20.0
New version is available! 0.7.2 -> 0.7.4
To update run: cargo +nightly install mrepl --force
app service was created with service id = d81a4de5-55c3-4cb7-935c-3d5c6851320d
elapsed time 486.234µs
1> help
Commands:
n/new [config_path] create a new service (current will be removed)
l/load <module_name> <module_path> load a new Wasm module
u/unload <module_name> unload a Wasm module
c/call <module_name> <func_name> [args] call function with given name from given module
i/interface print public interface of all loaded modules
e/envs <module_name> print environment variables of a module
f/fs <module_name> print filesystem state of a module
h/help print this message
q/quit/Ctrl-C exit
2>
```
## Fluence Proto Distributor: FLDIST
[`fldist`](https://github.com/fluencelabs/proto-distributor) is a command line interface \(CLI\) to Fluence peers that allows for the lifecycle management of services and offers the fastest and most effective way to deploy services.
```text
mbp16~(:|✔) % fldist --help
Usage: fldist <cmd> [options]
Commands:
fldist completion generate completion script
fldist upload Upload selected wasm
fldist get_modules Print all modules on a node
fldist get_interfaces Print all services on a node
fldist get_interface Print a service interface
fldist add_blueprint Add a blueprint
fldist create_service Create a service from existing blueprint
fldist new_service Create service from a list of modules
fldist deploy_app Deploy application
fldist create_keypair Generates a random keypair
fldist run_air Send an air script from a file. Send arguments to
"returnService" back to the client to print them in the
console. More examples in "scripts_examples" directory.
fldist env show nodes in currently selected environment
Options:
--help Show help [boolean]
--version Show version number [boolean]
-s, --seed Client seed [string]
--env Environment to use
[required] [choices: "dev", "testnet", "local"] [default: "testnet"]
--node-id, --node PeerId of the node to use
--node-addr Multiaddr of the node to use
--log log level
[required] [choices: "trace", "debug", "info", "warn", "error"] [default:
"error"]
--ttl particle time to live in ms
[number] [required] [default: 60000]
```
## Fluence JS SDK
The Fluence [JS SDK](https://github.com/fluencelabs/fluence-js) enables developers to build full-fledged applications for a variety of targets ranging from browsers to backend apps and greatly expands on the `fldist` capabilities.

View File

@@ -10,7 +10,7 @@ In order to have a service available out-of-the-box with the necessary startup a
Note that the deployment process is a fully automated workflow requiring you to merely submit your service assets, i.e., Wasm modules and configuration scripts, in the appropriate format as a PR to the [Fluence](https://github.com/fluencelabs/fluence) repository.
At this point you should have a solid grasp of creating service modules and their associated configuration files. See the [Developing Modules And Services](../development_development/) section for more details.
At this point you should have a solid grasp of creating service modules and their associated configuration files. See the [Developing Modules And Services]() section for more details.
Our first step is to fork the [Fluence](https://github.com/fluencelabs/fluence) repo by clicking on the Fork button at the upper right of the repo webpage and following the instructions to create a local copy. In your local repo copy, checkout a new branch with a new, unique branch name:
@@ -97,7 +97,7 @@ and the associated data file:
### **Scheduling Script**
Scheduling scripts allow us to decouple service execution from the client and instead rely on a cron-like scheduler running on a node to trigger our service\(s\). For a brief overview, see [additional concepts](../development_development/development_reward_block_app/development_additional_concepts.md)
Scheduling scripts allow us to decouple service execution from the client and instead rely on a cron-like scheduler running on a node to trigger our service\(s\). For a brief overview, see [additional concepts]()
### Directory Structure