Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms

Hiram Calvo, Ramón Rivera-Camacho, Ricardo Barrón-Fernández

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

© Springer Nature Switzerland AG 2018. Current natural language processing analysis is mainly based on two different kinds of representation: structured data or word embeddings (WE). Modern applications also develop further processing based on these latter representations. Several works choose to structure data by building WE-based semantic trees that hold the maximum amount of semantic information. Many different approaches have been explored, but only a few comparisons have been performed. In this work we developed a compatible tuple-based representation for Stanford dependency trees that allows us to compare two different ways of constructing tuples. Our measures mainly comprise tree reconstruction error, mean error over batches of given trees, and performance at the training stage.
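
As an illustration of the kind of pipeline the abstract describes, the sketch below turns a toy Stanford-style dependency parse into (head, relation, dependent) tuples of word embeddings, trains a small linear autoencoder on them, and reports the mean reconstruction error over the batch of tuples. The tuple layout, embedding size, network shape, and hard-coded parse are assumptions made for this example only; they are not the tuple constructions or settings compared in the paper.

# Illustrative sketch only (NumPy): tuple-based encoding of a toy dependency
# parse plus the reconstruction-error measure; not the paper's actual method.
import numpy as np

rng = np.random.default_rng(0)

# Toy Stanford-style dependency parse of "the cat sat on the mat",
# written directly as (head, relation, dependent) tuples.
PARSE = [
    ("sat", "nsubj", "cat"),
    ("cat", "det", "the"),
    ("sat", "nmod", "mat"),
    ("mat", "case", "on"),
    ("mat", "det", "the"),
]

DIM = 16  # assumed embedding size for this example
vocab = {w for h, r, d in PARSE for w in (h, d)}
relations = {r for _, r, _ in PARSE}
emb = {tok: rng.normal(size=DIM) for tok in vocab | relations}

def tuple_vector(head, rel, dep):
    """Concatenate head, relation, and dependent embeddings into one tuple vector."""
    return np.concatenate([emb[head], emb[rel], emb[dep]])

X = np.stack([tuple_vector(*t) for t in PARSE])  # shape: (n_tuples, 3 * DIM)

# Tiny linear autoencoder trained with plain gradient descent on the
# mean squared reconstruction error of the tuple vectors.
hidden = 8
W_enc = rng.normal(scale=0.1, size=(X.shape[1], hidden))
W_dec = rng.normal(scale=0.1, size=(hidden, X.shape[1]))
lr = 0.01
for step in range(2000):
    H = X @ W_enc            # encode tuples
    X_hat = H @ W_dec        # decode back to tuple space
    err = X_hat - X
    grad_dec = H.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# Mean reconstruction error over the batch of tuples from this parse.
recon_error = np.mean((X - (X @ W_enc) @ W_dec) ** 2)
print(f"mean tuple reconstruction error: {recon_error:.4f}")
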
Original language: American English
Title of host publication: Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms
Pages: 174-181
Number of pages: 8
ISBN (Electronic): 9783030011314
DOIs: https://doi.org/10.1007/978-3-030-01132-1_20
State: Published - 1 Jan 2018
Event: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Duration: 1 Jan 2019 → …

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11047 LNCS
ISSN (Print): 0302-9743

Conference

Conference: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Period: 1/01/19 → …

Cite this

Calvo, H., Rivera-Camacho, R., & Barrón-Fernández, R. (2018). Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. In Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms (pp. 174-181). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11047 LNCS). https://doi.org/10.1007/978-3-030-01132-1_20
Calvo, Hiram ; Rivera-Camacho, Ramón ; Barrón-Fernández, Ricardo. / Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. 2018. pp. 174-181 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).
@inproceedings{3ce019a3ee8b41a49afb0267a1c25e1f,
title = "Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms",
abstract = "{\circledC} Springer Nature Switzerland AG 2018. Current natural language processing analysis is mainly based on two different kinds of representation: structured data or word embeddings (WE). Modern applications also develop further processing based on these latter representations. Several works choose to structure data by building WE-based semantic trees that hold the maximum amount of semantic information. Many different approaches have been explored, but only a few comparisons have been performed. In this work we developed a compatible tuple-based representation for Stanford dependency trees that allows us to compare two different ways of constructing tuples. Our measures mainly comprise tree reconstruction error, mean error over batches of given trees, and performance at the training stage.",
author = "Hiram Calvo and Ram{\'o}n Rivera-Camacho and Ricardo Barr{\'o}n-Fern{\'a}ndez",
year = "2018",
month = "1",
day = "1",
doi = "10.1007/978-3-030-01132-1_20",
language = "American English",
isbn = "9783030011314",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "174--181",
booktitle = "Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms",

}

Calvo, H, Rivera-Camacho, R & Barrón-Fernández, R 2018, Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. in Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11047 LNCS, pp. 174-181, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 1/01/19. https://doi.org/10.1007/978-3-030-01132-1_20

Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. / Calvo, Hiram; Rivera-Camacho, Ramón; Barrón-Fernández, Ricardo.

Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. 2018. p. 174-181 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11047 LNCS).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms

AU - Calvo, Hiram

AU - Rivera-Camacho, Ramón

AU - Barrón-Fernández, Ricardo

PY - 2018/1/1

Y1 - 2018/1/1

N2 - © Springer Nature Switzerland AG 2018. Current natural language processing analysis is mainly based on two different kinds of representation: structured data or word embeddings (WE). Modern applications also develop further processing based on these latter representations. Several works choose to structure data by building WE-based semantic trees that hold the maximum amount of semantic information. Many different approaches have been explored, but only a few comparisons have been performed. In this work we developed a compatible tuple-based representation for Stanford dependency trees that allows us to compare two different ways of constructing tuples. Our measures mainly comprise tree reconstruction error, mean error over batches of given trees, and performance at the training stage.

AB - © Springer Nature Switzerland AG 2018. Current natural language processing analysis is mainly based on two different kinds of representation: structured data or word embeddings (WE). Modern applications also develop further processing based on these latter representations. Several works choose to structure data by building WE-based semantic trees that hold the maximum amount of semantic information. Many different approaches have been explored, but only a few comparisons have been performed. In this work we developed a compatible tuple-based representation for Stanford dependency trees that allows us to compare two different ways of constructing tuples. Our measures mainly comprise tree reconstruction error, mean error over batches of given trees, and performance at the training stage.

UR - https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85057286143&origin=inward

UR - https://www.scopus.com/inward/citedby.uri?partnerID=HzOxMe3b&scp=85057286143&origin=inward

U2 - 10.1007/978-3-030-01132-1_20

DO - 10.1007/978-3-030-01132-1_20

M3 - Conference contribution

SN - 9783030011314

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 174

EP - 181

BT - Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms

ER -

Calvo H, Rivera-Camacho R, Barrón-Fernández R. Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. In Semantic loss in autoencoder tree reconstruction based on different tuple-based algorithms. 2018. p. 174-181. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-030-01132-1_20