



APP4AR is a set of technologies and APPs that allow:

- Through numerous APPs (belonging to the family called PROGETTO APP) it allows a registered user to create and maintain a combination of the virtual and the real in the form of Augmented Reality;

- Through a series of APPs (belonging to the family called CONTENT APP) it allows the user to enjoy content, whether personal or designed by others;

- Through a series of APIs it allows external application environments to access the APP4AR framework to design and maintain augmented reality without using the PROGETTO APPs;

- Through the integration of different APPs in an operational context, indicated with the term COOPERATIVE APPS, it allows complex tasks to be carried out. As an explanatory example: the user uses the APP that records a text dictated by him, which passes it to the text display for editing or integration from the keyboard; the result then passes to the translation APP; the languages needed by the recipients are identified and the translations completed; finally, the result is sent as an attachment to an email being prepared for a series of recipients, identified by the email address they provided for a future follow-up while compiling a CHIRP;

- Complete support for practically all languages: MULTI LANGUAGE APPs. Each APP automatically configures itself in the language of the person browsing, and every help and dialogue message follows suit.
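The COOPERATIVE APPS example above (dictation, editing, translation, dispatch) can be sketched as a small pipeline. All function names here (record_dictation, translate, send_email) are illustrative placeholders, not the real APP4AR APIs; the translation is a tiny lookup table standing in for the Cognitive Services call.

```python
def record_dictation() -> str:
    """Stand-in for the voice-recording APP: returns the recognised text."""
    return "Meeting moved to Friday."

def translate(text: str, target_lang: str) -> str:
    """Stand-in for the translation APP (a tiny lookup, for illustration)."""
    samples = {"it": "Riunione spostata a venerdì.", "en": text}
    return samples.get(target_lang, text)

def send_email(recipient: str, body: str) -> dict:
    """Stand-in for the e-mail APP: returns the message it would send."""
    return {"to": recipient, "body": body}

# Recipients with the language each of them prefers (as gathered via CHIRPs).
recipients = {"anna@example.com": "it", "bob@example.com": "en"}

# One localised e-mail per recipient: dictate once, translate per language.
outbox = [send_email(addr, translate(record_dictation(), lang))
          for addr, lang in recipients.items()]
```

The point of the sketch is the hand-off: each stage is a separate APP, and the pipeline glues them together per recipient language.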


The CONTENT and PROJECT APPs are available in the general APP4AR environment, currently structured in two SQUARES dedicated to hosting the various APPs in virtual areas called APPLICATION CORNERS.

The CONTENT corners and their underlying APPs are available to those who design augmented reality: they implement a series of "virtual objects" that can populate one's own augmented reality. The non-exhaustive list includes:

- corners for playing voice and sound clips;

- corners for playing movies;

- corners for the presentation of texts;

- corners for PDF presentation;

- corners for the presentation of image collections;

- corners for reproducing 3D drawings;

- corners to "interact" with the owner of the augmented reality: simple CHIRPs and BOTs (see below) and cooperative APPs;

- corners to dialogue in text chats with simultaneous translations between Augmented Reality users;

- corners for talking in voice or video chats between Augmented Reality users;

- corners to navigate within the augmented reality transferred by other users (teleportation);

- corners for "GPS georeferencing" and navigation compass;

- corners for listening to radio stations;

- corners to view programs and schedules on SMART TV channels;

- corners to receive hyper-messages both of an advertising nature and for the expansion of augmented reality;

- corners to receive and send NOTIFICATIONS of various types [Notification Ecosystem];

- corners for financial functions [ETHEREUM ECOSYSTEM], wallets and SMART CONTRACTS, accounting reports;

- corner for searching the Meta-WEB, a web where physical reality and its virtual expansion are described [CUSTOM BING SEARCH ENGINE]. The search produces references to "objects" active in the real world and inserts links into the navigation cockpit menu [NAVIGATION COGNITIVE SEARCH BASED] that allow the content to be "exploded" as if the user had interacted with the "active object" (the outcome of the search) in the real world, framing and activating it. By selecting the menu, the user navigates directly to the underlying augmented reality.

- corner for ADVANCED CONTENT SEARCH [generated by the Intelligent Document APP, a CONTENT APP]: it makes use of cognitive services aimed at analyzing the content of documents (texts) that the creator wishes to submit to such analysis; this is optional. On the search side, advanced search explores this "cognitive space" and produces links and, where possible, the visualization of the content for the requester (under the text and PDF corners);

- corner for search supported by AZURE COGNITIVE SEARCH, for user searches in augmented reality and in documentary content [texts, brochures];

- corner called ADVANCED EMAIL STATION where the sending of emails to others is supported by a large series of sub-corners with extensive use of COGNITIVE SERVICES FOR LANGUAGE;

- corner dedicated to the ROLES that the registered user decides to assume in augmented reality. As described on the page, they allow the user to operate in augmented reality as a subject who merely uses it or as a subject who, using the content APPs, brings and maintains his presence. Some roles are incentivized by financial returns on the relevant "bank" account. Each role is signed by adhering to a SMART CONTRACT [deployment] which introduces a "utility function" mediated by the theory of ECONOMIC MECHANISMS. The ETH-ECOSYSTEM ROLES page of the APP4AR Site, to which reference is made, illustrates the cost component as well as the incentives present in the contractual mechanism;

- e-commerce corner: AR-STORE [ETH-STORE] where "active objects" are available for purchase to be placed in physical reality with the aim of "activating" one's augmented reality or NFTs of content to enrich augmented reality with various contents proposed by the NFT designers;

- corner for analyzing your own augmented reality in terms of "quality of presence and effectiveness" using event analysis algorithms in classic report mode;

- AI ANALYSIS corner: the user interacts according to OPEN AI paradigms and can interrogate augmented reality, its connection with physical reality, and the actionable content. The "evolutionary life" that takes place in augmented reality is constantly "recorded" by APP4AR in unstructured form, and the cognitive engine is updated (anonymously) on the events that occur and on the A.R. ASSETS that "are born" both on the virtual side (documents, movies...) and on the physical side (an active object placed in a given physical location...), distinguishing them into the categories of private and public ASSETS;

- FUNCTIONAL A.I. SUPPORT corner: this corner allows the user to describe an application need to the A.I. engine, i.e. how to carry out a certain activity aimed at either the virtual or the real world. The A.I. responds by generating text suggesting which corners, and in which sequence, can achieve the requested objective. It can also produce the A.I. NAVIGATION LIST in the cockpit (see the COCKPIT chapter).


Making use of cooperation with Cognitive Services for language, the dialogue is always translated into the language and script of the connected user.
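As a minimal sketch of that translation step, here is how a corner might assemble a call to Azure's Translator REST API (part of Cognitive Services for language). The request is only built, not sent; the endpoint, query parameters, and header names follow the public Translator v3.0 API, while the subscription key is obviously a placeholder.

```python
import json

def build_translate_request(text: str, target_lang: str, key: str) -> dict:
    """Assemble (but do not send) a Translator v3.0 request payload."""
    return {
        "url": "https://api.cognitive.microsofttranslator.com/translate",
        "params": {"api-version": "3.0", "to": target_lang},
        "headers": {
            "Ocp-Apim-Subscription-Key": key,   # placeholder, not a real key
            "Content-Type": "application/json",
        },
        # The Translator API takes a JSON array of {"text": ...} objects.
        "body": json.dumps([{"text": text}]),
    }

req = build_translate_request("Welcome to my corner", "it", "<YOUR-KEY>")
```

In the real flow, the target language would come from the connected user's profile, and the response's translated text would replace the corner's message.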

- NAVIGATION in the A.R. world is an important support, considering the number of objects present in the Squares, and is covered under the COCKPIT item.

- provision of corners towards Web URLs: allows access to traditional WEB pages, leaving the A.R. context to navigate, for example, to an e-commerce website. This corner, present in various forms, provides a cross-link between A.R. and WEB;

- a corner for the "manipulation" of the virtual world: rotate, resize, move;

- provision of corners towards IoT devices;

- …

EACH APP (both content and project) is served by a locally available Tool which works alongside it to allow the user, according to his needs, to move, rotate or resize the components of the corner being used.
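The move/rotate/resize operations of that Tool amount to a standard 3D transform. The sketch below uses plain Python math for illustration (the real Tool acts on Unity transforms): a point on a corner component is scaled, rotated around the vertical axis, then translated.

```python
import math

def transform(point, scale=1.0, yaw_deg=0.0, offset=(0.0, 0.0, 0.0)):
    """Apply scale, then a rotation around the Y (vertical) axis, then a move."""
    x, y, z = (c * scale for c in point)
    a = math.radians(yaw_deg)
    # Rotation about the Y axis: x and z change, height y is preserved.
    x, z = x * math.cos(a) + z * math.sin(a), -x * math.sin(a) + z * math.cos(a)
    return (x + offset[0], y + offset[1], z + offset[2])

# Double the size, turn 90 degrees, raise by one unit.
p = transform((1.0, 0.0, 0.0), scale=2.0, yaw_deg=90.0, offset=(0.0, 1.0, 0.0))
```

The order (scale, rotate, translate) matters: translating first would make the rotation swing the component around the Square's origin instead of in place.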


The cockpit is the panel available for moving around in augmented reality, both during creation and use.

The user has many CAMERAS available:

- One, which can be activated or deactivated as needed, appears in the cockpit panel and frames the real world;

- One is the camera that displays the user's "gaze", what he sees from the point where he stands, called the FIRST PERSON CAM;

- Countless DIRECTION cameras, available to carry out navigation directly or in response to events received from the outside world.


This panel can be turned off when not needed. It includes many features:


- Navigation using joysticks;


- Direct navigation following a functional menu that allows the user who has entered the virtual world to go directly to the corners of interest;


- Zoom in detail;


- Receive NAVIGATION NOTIFICATIONS from other USERS [NAVIGATION USERS BASED], coming from users with whom an interest in cooperating has been expressed. When activated, they generate the teleportation to the external augmented reality, navigating directly to where that reality is "produced" and developed. They are a vehicle for achieving cooperation between users, and a tool for sending additional A.R. support and information of various kinds (complementary, maintenance, promotional) towards one's customers. The connected user sets the appropriate filters, and can accept only notifications coming from suppliers with whom he has an ongoing collaboration.


- Receive NAVIGATION NOTIFICATIONS from ADVERTISERS [NAVIGATION ADVERTIZER BASED], conveying advertising messages and content generated by users who adhere to APP4AR's advertising plans and assume the ADVERTISER role. The user selects appropriate filters on the product classification [NACE 1 & 2, EUROSTAT Nomenclature statistique des activités économiques];

- Receive NAVIGATION NOTIFICATIONS following your own search in the Meta-WEB [NAVIGATION COGNITIVE SEARCH BASED]. The cockpit navigation menu contains links to the "active objects" resulting from the search, which can therefore be activated in a surrogate way without the need for interaction in physical reality;


- Receive NAVIGATION NOTIFICATIONS following the outcome of the functional help dialogue with OPEN A.I. [NAVIGATION A.I. BASED], where direct links appear to the virtual space containing the corners needed to carry out the task produced by the text suggested by the A.I. engine (see A.I. FUNCTIONAL SUPPORT);


- Receive NAVIGATION NOTIFICATIONS from external application environments through the use of specific APIs [NAVIGATION API BASED]. External applications can use the APIs for this purpose, or for other content and project purposes, after having assumed the API CLIENT role and received the appropriate authorizations and identification methods. The navigation message is adapted to the language of the currently connected user and carries the "active object" that develops the content presented at the end of the direct navigation.
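The API-based notification just described can be pictured as a small payload carrying the active object plus a per-language message, adapted to the connected user. The field names and fallback rule below are assumptions for illustration, not the documented APP4AR API schema.

```python
from dataclasses import dataclass, field

@dataclass
class NavNotification:
    active_object: str          # id of the object to develop on arrival
    messages: dict = field(default_factory=dict)  # language code -> message

    def for_user(self, user_lang: str, fallback: str = "en") -> dict:
        """Adapt the notification to the connected user's language."""
        text = self.messages.get(user_lang, self.messages[fallback])
        return {"object": self.active_object, "message": text}

note = NavNotification(
    active_object="stand-42",
    messages={"en": "New content available", "it": "Nuovi contenuti disponibili"},
)
payload = note.for_user("it")
```

An API CLIENT would post one NavNotification; the cockpit then renders the message in whichever language the framing user is browsing in.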



Let's mention PANEL 08. This panel is inserted in almost all the stands in the Squares, on the right side at the back. It performs a series of special functions: it represents an additional or external presence space within the augmented reality present at any moment.

When the augmentation of reality generated by an "augmented reality producer", often referred to as the A.R. OWNER, is being presented to a connected user, usually indicated with the term WIF (Who Is Framing), the producer can decide what use to make of the port represented by the Panel.

He can:

- Decide to accept the arrival of advertising, defining the category limits [NACE 1 & 2];

- Decide to use it to expand the augmented reality with insights, often developing a second-level presentation, which may include a CHIRP.

CHIRPs are content APPs that allow communication with the connected user, the WIF, who is exploring the instance of augmented reality produced by the encounter with an active object owned by the A.R. OWNER.

CHIRPs can come in a simple form or with the support of a BOT.

In absolute respect of the WIF's privacy, he can communicate via the CHIRP by providing or requesting information.

  • 1. A CHIRP is always instantiated in the WIF's language and script.

  • 2. In the dialogue it receives the cooperative support of the APPs:

  • 3. Cognitive language services, for translations into the A.R. OWNER's language;

  • 4. voice services, to record voice messages;

  • 5. a files browser, to access the WIF's local resources;

  • 6. a blobs browser, to access the WIF's resources in the Cloud;

  • 7. the option to become a CONTACT and thus be able to receive further information or other support from the A.R. OWNER;

  • 8. for BOT-type CHIRPs, the A.R. OWNER has created, fed and maintained an OPEN A.I. ENGINE dedicated to each single CHIRP. This represents a powerful background for communicating, within the CHIRP session, with the WIF. The chatbot-style window included in the CHIRP itself gives the WIF the opportunity to converse with the A.I. about whatever, and however, the A.R. OWNER wishes to inform, support and assist its client.
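For the BOT-type CHIRP, the request that the dedicated engine receives can be sketched as follows: the A.R. OWNER's briefing becomes the system prompt and the WIF's question the user message. The payload shape follows the public chat-completions format; the model name is a placeholder and nothing is actually sent here.

```python
def build_chirp_query(owner_briefing: str, wif_question: str) -> dict:
    """Assemble a chat-completions style payload for the CHIRP's A.I. engine."""
    return {
        "model": "gpt-4o-mini",   # placeholder model name, not APP4AR's actual choice
        "messages": [
            # The OWNER's briefing scopes what the BOT may talk about.
            {"role": "system", "content": owner_briefing},
            # The WIF's message, already translated into a common language.
            {"role": "user", "content": wif_question},
        ],
    }

payload = build_chirp_query(
    "Answer only about the products shown in this stand.",
    "Is the 3D gadget available in red?",
)
```

Each CHIRP having its own engine means each system prompt, and therefore each conversational scope, is maintained per active object by the A.R. OWNER.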

The CHIRP data are collected and, in particular, constitute a structured basis for the ADVANCED EMAIL APP service for all marketing activities.

Please refer to the demo (imminent) on the ADVANCED EMAIL APP, which shows how, in an email marketing campaign, the A.R. OWNER queries all events related to CHIRPs and can proceed to send emails and attachments to the various compilers of the specific CHIRPs under examination [these are a de facto collection of the interactions that occurred in the A.R. world], pertinent to the purpose and to the subject, product, or service handled by each CHIRP. The cooperative support APP manages the translations, the BLOBS BROWSER cooperative APP manages the documents, and the Record and Speech cooperative APP manages the vocal side of the interaction.
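The query over CHIRP events can be pictured as a grouping step: interactions are collapsed to one campaign target per compiler, keeping the language each one used so the email can be localised. The event structure below is an assumption made for illustration, not the demo's actual data model.

```python
# Hypothetical CHIRP interaction events collected in the A.R. world.
events = [
    {"chirp": "demo-stand", "email": "anna@example.com", "lang": "it"},
    {"chirp": "demo-stand", "email": "bob@example.com",  "lang": "en"},
    {"chirp": "demo-stand", "email": "anna@example.com", "lang": "it"},
]

def campaign_targets(events: list) -> dict:
    """One entry per compiler: duplicates collapse, language is retained."""
    targets = {}
    for ev in events:
        targets[ev["email"]] = ev["lang"]
    return targets

targets = campaign_targets(events)
```

From here, each target would receive one email, with the translation and attachment steps handled by the cooperative APPs mentioned above.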


The PROJECT APPs are dedicated to creating, maintaining and publishing augmented reality and connecting it with the real world.

We list some corners where these APPs work. More details at

- Corner to define your user profile (free services, but with some limits) or customer profile (paid, every service active) and all the data essential for use;


- Corner to create an active image. Starting from a logo or image of your own, you create an active object. The APP allows you to choose the candidate image and:

  • Check that the identification is of good quality (ranking based on a scale of 1 to 5 stars);

  • Check that it does not collide with other active images, yours or those of other users;

  • Add a watermark for copyright reasons;

  • Decide whether to use it in one location or in many different locations in the real world;

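The quality check on the candidate image can be sketched as a thresholding step. Image-recognition platforms such as Vuforia rate targets on a star scale; the minimum-star threshold below is a hypothetical policy, chosen only to illustrate the accept/reject decision.

```python
def check_image_quality(stars: int, min_stars: int = 3) -> dict:
    """Accept a candidate active image based on its star ranking (1 to 5)."""
    if not 1 <= stars <= 5:
        raise ValueError("rating must be between 1 and 5 stars")
    return {"stars": stars, "accepted": stars >= min_stars}

result = check_image_quality(4)
```

A low-star image (few trackable features) would be rejected here, prompting the user to pick a richer candidate before activating it.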

- Corner to define special images: pilot images or GPS-supported images;


- Corner to define the use of special VUFORIA Markers: these are markers supplied to each user and already active. They are used to create and manage personal augmented reality, to build your own A.R. world [for example, to connect the case files of a law firm, a doctor's practice...];

- Corners to create ACTIVE BUSINESS CARDS [personal, professional, business];

- Corners to create and manage advanced TXT & VOCAL TASKS;

- Corners to create COGNITIVE LANGUAGE TASKS: by typing a text, the translation is produced in all languages, with reading aloud in each of them;


- Corner to access PDF files: maintaining the original graphics, it generates versions for all languages and scripts;


- Corner to create Radio Stations;


- Corner to create a SMART TV and related schedule;


- Corner to create simple CHIRPs;


- Corner to create CHIRPs assisted by A.I. BOTs and to feed the underlying A.I. Engine;



- Corner to create voice-clip blobs from the microphone;


- Corner to create text blobs from the microphone;


- Corner to create text blobs from dictated clips;


- Corner to create movies;


- Corner to define the locations: physical (with GPS coordinates via the GPS cooperative APP and geo-reference calculation), virtual (e.g. an advertising campaign in a magazine or newspaper, or on a generalist TV channel), or mixed, in a SMART TV schedule, physical or virtual;
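The geo-reference calculation behind physical locations can be sketched with the haversine formula, which gives the great-circle distance between two GPS coordinates; a GPS cooperative APP might use something like this to decide whether the WIF is near a physically placed active object. The two sample points (roughly the Milan Duomo and Castello areas) are illustrative.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Two points in central Milan, roughly a kilometre apart.
d = haversine_m(45.4642, 9.1900, 45.4705, 9.1794)
```

A proximity threshold on this distance (tens of metres, say) would decide whether a location-bound active object is offered to the framing user.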


- Corner to enter your own structure and assets in the META-WEB;


- Corner to define the presence of public and private ASSETS, feeding your dedicated A.I. Cognitive Engines;

- Corner to activate text, voice and video chats;


- Corner to create presentations of 3D objects, reproduced in virtual space, using glTF blobs [glTF - Wikipedia]. glTF is the standard adopted in APP4AR and can be easily obtained from all the most popular 3D CAD systems;



- Corner to define the advertising plan (if with advertiser role) with the available priority options;


- Corner to assume roles of interest, generate wallets, accounts and sign SMART CONTRACT;


- Corner to send notifications (Notification Ecosystem)


- Corner to create virtual content albums and linked expansions;


- Corner to define the DISPATCHERS and their connection and association with active objects and content albums: the dispatcher has the role of identifying the language and preferences of the WIF who frames the "active object" and routing him to the album appropriate for his characteristics.
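The dispatcher's routing decision can be sketched as a lookup from the WIF's detected language to the matching content album, with a fallback album for languages not covered. Album names and the preference model here are assumptions made for illustration.

```python
def dispatch(wif_lang: str, albums: dict, fallback: str = "en") -> str:
    """Route the framing WIF to the album matching his language, or the fallback."""
    return albums.get(wif_lang, albums[fallback])

# Albums the A.R. OWNER has associated with one active object.
albums = {"en": "album-international", "it": "album-italiano"}

chosen = dispatch("it", albums)
```

In the full system the dispatcher would also weigh preferences beyond language, but the shape of the decision, framed object in, album out, stays the same.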


- Corner to create an NFT specifying the characteristics of the SMART CONTRACT for sale and insertion in the ETH-STORE, APP4AR e-commerce;


- Corner to create collections of already active thematic images ready for use by buyers and specifying the characteristics of the SMART CONTRACT for sale and insertion in the ETH-STORE, APP4AR e-commerce;


- Corner for the creation of 3D GADGETS by sending CAD drawings;


- Corner to create PROMOTIONAL CAMPAIGNS with discount BONUSES or through incentives in Ethereum crypto;


- Corner for sending advertising messages (if the Advertiser role is assumed and active)


- ……………



The APP store allows you to purchase NFTs or collections of pre-packaged active images or 3D gadgets.

More details at



APP4AR creates an integrated environment for augmented reality, making intensive use of innovative technologies mainly linked to A.I.: cognitive services aimed at language, generative text, image recognition, and speech processing.

APP4AR introduces the 3D approach essential for creating the virtual part of Augmented Reality, equipped with space emulation on the perspective and navigation side, including strong use of visual and vocal support effects.

An important aspect is navigation in the virtual world which must make use of valid support to facilitate the identification of the areas and functions of interest which are "dispersed" in virtual reality.

This approach replaces, and enhances in evolutionary terms, the WEB 2 standard based on the use of text links to navigate from page to page.

The adopted Model instead "navigates" in 3D space, no longer just 2D, and moves between application corners, i.e. between 3D spaces served by one or more cooperative APPs, where pages are reduced to a lower-ranking component (a User Interface) and where, in the background, increasingly sophisticated and pervasive A.I. modules add their support.

  This world is always connected to the real world, the connection ensured by the active objects, a connecting bridge between the two sides of the representation.

This 3D world is entirely present: every corner of it listens to the events that develop and arrive from the real and virtual worlds.

Some of the APPs present group together a set of functions that make them candidates to become standalone APPs with functions that can also be used outside of the overall space.

The A.R. space of APP4AR is connected, through its APPs, to cloud services aimed at the traditional data sector (Database and Blob containers), to functions processed on cloud servers, and to the most advanced A.I. modules aimed at language, voices, images, and generative engines.

This, for example, allows each APP present to be multi-lingual, along with its support messages.

The APP4AR space overcomes every language and writing barrier in a practically automatic way.

Thus the cooperation between the various APPs somehow merges the various textual and vocal, written and spoken expressions.

Further developments are underway to include images using Microsoft Azure's innovation in Media Services.

The presence of access channels via REST APIs offers the potential to connect the two components of the A.R., and its ancillary components, to external applications as well: a third link.

A further point worthy of note is the strong integration, through the xAR token, with the crypto payment approach, both on the side of costs, incentives, and promotions and, in general, of the SMART CONTRACT as an economic mechanism. This allows debit or incentive microtransactions to be carried out in an automatic, centralized, verifiable, immutable, notarized DLT way, grouping them in an efficient, low-overhead way.
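The grouping of microtransactions can be pictured as netting per account before settlement: one on-chain entry per user instead of one per event, which is what keeps the overhead low. Amounts below are in a hypothetical smallest unit of the xAR token, and the whole batching policy is an illustrative assumption.

```python
from collections import defaultdict

def net_microtransactions(events: list) -> dict:
    """events: (account, signed_amount) pairs -> net amount per account."""
    totals = defaultdict(int)
    for account, amount in events:
        totals[account] += amount
    return dict(totals)

batch = net_microtransactions([
    ("0xA1", +50),   # incentive credited to the user
    ("0xA1", -20),   # debit for a paid service
    ("0xB2", -5),
])
```

The netted batch is what a settlement SMART CONTRACT would then record on the DLT, where each entry stays verifiable and immutable.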

There is also the possibility of creating private Ethereum networks to support some functions (for example, Remote Patient Monitoring uses an Ethereum net on a private Amazon Cloud).


- Evaluation of the market, of the interest, and of the know-how for the use of APP4AR and its native PROJECT modules in the sectors of:

- Tourism

- Education

- E-commerce: Content products conveyed with NFTs (e.g. teaching, art, museums, fairs...);

- E-commerce: Accessory services such as production of gadgets with 3D Printer (service)

- E-commerce: Creation of CUSTOM STANDS that can be inserted as replacements in the APP4AR squares, using 3D Scanner services

- E-commerce: production of active thematic collections for product sectors;

- Informational (e.g. reception at treatment centres, administration, help desks...)

- Advertising (active images on SMART TVs placed in shop windows, active images inserted within standard messages on generalist TVs, magazines, newspapers...)

- Editorial production of thematic NFTs

- Virtual publishing

- Health profession: A.R. model for patient care

- Profession: A.R. model for customer activities

- Remote Patient Monitoring (a project already available)

The above may find interest in being developed through traditional web portals connected to augmented reality through the use of the Rest API.

With this we intend to evaluate the development of vertical application lines (in the sectors indicated above, or in other better-identified ones) which can be implemented through the creation of dedicated portals of the classic development-and-presentation type, where the specific needs of a target sector are tailored and the APIs behind them feed the A.R. image and the connection with the real and virtual 3D world.

Anticipating this approach, APP4AR already exists in two versions:

- APP4AR full, where all the PROJECT and CONTENT APPs operate (a little restricted);

- APP4ArVIEW, aimed at the content, the version most widely expanded in space.

APP4ArVIEW operates in augmented reality while ignoring whether the content comes from development using the native PROJECT corners in APP4AR or was generated from classic 2D thematic portals through the "feeder" APIs used by the portals and their users.

There remains the possibility of concentrating the market development activity on some APPs which have such consistency and generality that they can be usefully used in a standalone version.

Lastly, the APP4AR environment allows specific APPs to be developed, if this aspect appears of interest, even on commission. APP4AR holds a library of modules to access all Microsoft Azure services, Open A.I., Ethereum, Vuforia and other relevant cloud providers: modules integrated and tested for the Unity 3D environment, and therefore suitable for producing customized APPs in a reasonably short time.

The environment is developed in C#, Python, and Solidity.

The structure is object-based according to the dominant Unity 3D and/or Azure Services framework scheme.

Details NAVIGATION algorithm facilities
