Repositories under the rdf-triples topic:
The OpenLink Structured Data Editor enables editing of RDF documents (in Turtle notation) stored at a variety of HTTP-accessible locations. Actual document access requires that the target document be served from a system that supports at least one of the following open standards: Linked Data Platform (LDP), WebDAV, SPARQL 1.1 Update, or the SPARQL Graph Store Protocol.
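For orientation, here is a minimal sketch of how a Turtle document can be written to a server over the SPARQL 1.1 Graph Store Protocol, one of the standards listed above; the endpoint URL, graph IRI, and sample triples are placeholders, not anything shipped with the editor.

```python
import requests

# Placeholder Graph Store Protocol endpoint and target graph IRI.
ENDPOINT = "https://example.org/rdf-graph-store"
GRAPH_IRI = "https://example.org/data/people.ttl"

turtle_doc = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<https://example.org/id/alice> a foaf:Person ; foaf:name "Alice" .
"""

# PUT replaces the content of the named graph with the supplied Turtle
# document, as defined by the SPARQL 1.1 Graph Store HTTP Protocol.
resp = requests.put(
    ENDPOINT,
    params={"graph": GRAPH_IRI},
    data=turtle_doc.encode("utf-8"),
    headers={"Content-Type": "text/turtle"},
)
resp.raise_for_status()
```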
Convert OpenStreetMap (OSM) data to RDF Turtle, including object geometries and the predicates ogc:sfContains, ogc:sfCovers, ogc:sfIntersects, ogc:sfTouches, ogc:sfCrosses, ogc:sfEquals, and ogc:sfOverlaps. Weekly-updated downloads for the whole planet (~80 billion triples) and per country.
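As an illustration of the output shape, the snippet below parses a hand-written Turtle fragment that uses the GeoSPARQL simple-features predicates; the osmrel:/osmway: prefixes and the specific IDs are invented for the example and are not the project's actual IRI scheme.

```python
from rdflib import Graph

# Hand-written Turtle fragment: the OSM IRIs and IDs are illustrative only;
# the spatial-relation predicates come from the GeoSPARQL vocabulary.
data = """
@prefix ogc: <http://www.opengis.net/ont/geosparql#> .
@prefix osmrel: <https://www.openstreetmap.org/relation/> .
@prefix osmway: <https://www.openstreetmap.org/way/> .

osmrel:62422 ogc:sfContains osmway:4683713 .
osmway:4683713 ogc:sfIntersects osmway:4683714 .
"""

g = Graph()
g.parse(data=data, format="turtle")
for s, p, o in g:
    print(s, p, o)
```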
Convert RDF to Semantic MediaWiki facts in MediaWiki XML format, with a standalone command-line tool
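To make the mapping concrete, here is a toy sketch that turns RDF triples into Semantic MediaWiki inline facts of the form [[Property::Value]]; the page and property naming rules are assumptions for the example, not the tool's actual conversion logic.

```python
from rdflib import Graph

def triples_to_smw(graph):
    """Group triples by subject and render each as a [[Property::Value]] fact."""
    pages = {}
    for s, p, o in graph:
        page = str(s).rstrip("/").rsplit("/", 1)[-1]          # crude page name
        prop = str(p).rsplit("#", 1)[-1].rsplit("/", 1)[-1]   # crude property name
        pages.setdefault(page, []).append(f"[[{prop}::{o}]]")
    return pages

g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:Alice ex:knows ex:Bob ; ex:age 42 .
""", format="turtle")

for page, facts in triples_to_smw(g).items():
    print(page, facts)
```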
Enterprise Credit Analysis Platform (EEAP)
💻Knowledge Base Systems (or Semantic Web Systems) Assignments for ECE NTUA
Design of the Vehicle Signal Ontology (VSSo) and an example of its use with SOSA/SSN and STEP
Visualization of RDF data: shows the relationships and links between RDF resources, queries different datasets with SPARQL statements, supports labeling of RDF, and compares the performance of TripleBit with other graph databases
Statistical Type Inference (both fully automatic and semi-supervised) for RDF datasets
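As a rough illustration of the idea (not this project's actual algorithm), one fully automatic baseline is to guess a missing rdf:type from the types most often seen on other subjects that use the same predicates:

```python
from collections import Counter, defaultdict
from rdflib import Graph, RDF

def infer_types(g):
    """Majority-vote type guesser: predicates vote with the types they co-occur with."""
    preds_by_subject = defaultdict(set)
    types_by_subject = defaultdict(set)
    for s, p, o in g:
        if p == RDF.type:
            types_by_subject[s].add(o)
        else:
            preds_by_subject[s].add(p)

    # How often does each predicate appear on subjects of each type?
    type_votes = defaultdict(Counter)
    for s, preds in preds_by_subject.items():
        for t in types_by_subject.get(s, ()):
            for p in preds:
                type_votes[p][t] += 1

    guesses = {}
    for s, preds in preds_by_subject.items():
        if types_by_subject.get(s):
            continue  # subject already has an explicit type
        votes = Counter()
        for p in preds:
            votes.update(type_votes[p])
        if votes:
            guesses[s] = votes.most_common(1)[0][0]
    return guesses

g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:alice a ex:Person ; ex:name "Alice" .
ex:bob ex:name "Bob" .
""", format="turtle")
print(infer_types(g))  # ex:bob is guessed to be an ex:Person
```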
A simple parser for the N-Triples data format as defined by the W3C for storing RDF/Semantic information. N-Triples is a line-based, plain text format for encoding an RDF graph.
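The line-oriented nature of the format is easy to see in a deliberately simplified reader like the one below; it handles IRIs, blank-node labels, and plain literals, skips comments and any line it cannot match, and ignores the language tags, datatypes, and escape rules that a conforming parser must support.

```python
import re

# One triple per line: subject, predicate, object, terminated by " ."
LINE = re.compile(
    r'^\s*(<[^>]*>|_:\S+)\s+(<[^>]*>)\s+(<[^>]*>|_:\S+|".*")\s*\.\s*$'
)

def parse_ntriples(text):
    for line in text.splitlines():
        if not line.strip() or line.lstrip().startswith("#"):
            continue  # blank line or comment
        m = LINE.match(line)
        if m:
            yield m.groups()

doc = '<http://example.org/s> <http://example.org/p> "an object" .'
print(list(parse_ntriples(doc)))
```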
LODeXporter is a GATE component for transforming NLP annotations into RDF triples
Real-time Analytics for Gmail accounts
TORA (Topographical Register at the National Archives) is a research resource at the Swedish National Archives
SPOG, the easiest way to create RDF graphs. A collaborative graph for brainstorming, shared mind maps, exploration of data lakes made up of multiple knowledge bases, communication with connected objects, ...
Based on our paper "Extracting RDF Triples from Raw Text"
A repository containing the resources needed to create a catalog of software models and link them together
In the WebNLG challenge, participants are given structured data in the form of a logical form or a knowledge graph and are asked to generate natural language text that accurately and coherently describes the information it contains. The generated text is evaluated against a number of criteria.
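A made-up pair in the spirit of the task (not taken from the actual dataset) shows the shape of the input and the expected output:

```python
# Illustrative WebNLG-style data-to-text pair; the subjects, predicates, and
# reference sentence are invented for clarity.
triples = [
    ("Aarhus_Airport", "cityServed", "Aarhus"),
    ("Aarhus", "country", "Denmark"),
]
reference_text = "Aarhus Airport serves the city of Aarhus, which is located in Denmark."
```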
The work presented here was developed during an internship in Natural Language Generation research at the Insid&s Lab in Milan-Bicocca. It deals with the creation of a framework for properly assessing the impact of input-dataset quality on the quality of the text generated by NLG models, specifically: creation of the "Concept-Based" and "Entity-Based" versions of the WebNLG dataset; evaluation of the quality of the created datasets; training of LSTM and Transformer models using the OpenNMT tool; natural language text generation with the LSTM and Transformer models; evaluation of the quality of the generated text; and a final analysis.
Semantic Web, part of a course project, YouTube API
This project was realized within the framework of the 85575 - Semantic Digital Libraries course of the 2nd-year DHDK Master's degree, academic year 2021-22. It analyzes, from a technical viewpoint, the migration of the University of Bologna's AMS Historica repository, which specializes in digitized historical items.
Visualization of data through Graphviz, with the help of an NLP repo.
Node addon to parse and manipulate N3/Turtle data.
id-svo-extractor: Extract SVO triples from Indonesian text.
Analysis of aviation accidents in commercial air carriers from 2000 to 2009
Vantage6 algorithm that retrieves the descriptive information of triplestores in a collaboration
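A descriptive query of that kind might look like the sketch below, which counts triples and distinct subjects; the endpoint URL is a placeholder and the real vantage6 algorithm may collect different statistics.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder SPARQL endpoint of one triplestore in the collaboration.
endpoint = SPARQLWrapper("https://example.org/sparql")
endpoint.setQuery("""
    SELECT (COUNT(*) AS ?triples) (COUNT(DISTINCT ?s) AS ?subjects)
    WHERE { ?s ?p ?o }
""")
endpoint.setReturnFormat(JSON)

row = endpoint.query().convert()["results"]["bindings"][0]
print("triples:", row["triples"]["value"], "subjects:", row["subjects"]["value"])
```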
This plugin aims to import connected database data and transform it into a geospatial data cache in the QGIS geographic information system (https://qgis.org/).
This project provides a simple Dockerized RDF endpoint that serves an .rdf file and publishes a SPARQL endpoint.
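A minimal sketch of the idea (not this project's code, and with the Docker packaging omitted) could load the file with rdflib and answer SELECT queries over HTTP; the file name, route, and port are assumptions.

```python
from flask import Flask, Response, request
from rdflib import Graph

app = Flask(__name__)

# Placeholder data file served by the endpoint.
g = Graph()
g.parse("data.rdf", format="xml")

@app.route("/sparql", methods=["GET", "POST"])
def sparql():
    # Accept the query as a 'query' parameter (form field or query string).
    query = request.values.get("query", "SELECT * WHERE { ?s ?p ?o } LIMIT 10")
    results = g.query(query)
    return Response(results.serialize(format="json"),
                    mimetype="application/sparql-results+json")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```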
Extract data from RDF/XML files
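For instance, rdflib can load an RDF/XML file and expose its contents either as raw triples or through a small SPARQL query; the file name and the dc:title predicate below are assumptions for the example.

```python
from rdflib import Graph

g = Graph()
g.parse("data.rdf", format="xml")  # placeholder RDF/XML file

# Pull out title-like values with a small SPARQL query.
for row in g.query("""
    PREFIX dc: <http://purl.org/dc/elements/1.1/>
    SELECT ?doc ?title WHERE { ?doc dc:title ?title }
"""):
    print(row.doc, row.title)
```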
A simple repository created as I learned about ontologies and how to use and explore them.