Language Technologies Ph.D. Thesis Defense
- Remote Access Enabled - Zoom
- Virtual Presentation
- YUEXIN WU
- Ph.D. Student
- Language Technologies Institute
- Carnegie Mellon University
Learning with Graph Structures and Neural Networks
Graph-based learning focuses on modeling graph-structured data. Significant applications include the analysis of chemical compounds based on molecular structures, the prediction of solar-farm energy output based on radiation-sensor network data, and the forecasting of epidemiological outbreaks based on geographical relations among cities and social-network interactions. Algorithms for graph-based learning have developed rapidly to address the following fundamental challenges:
- To encode the rich information about each individual node and about node combinations in a graph, a.k.a. the graph-based representation-learning challenge;
- To recover missing edges when graphs are only partially observed, a.k.a. the graph completion challenge;
- To leverage active learning in graph settings where labeled nodes are highly sparse, a.k.a. the label-sparsity challenge;
- To enhance the tractability of training and inference over very large graphs, a.k.a. the scaling challenge.
This thesis aims to enhance graph-based machine learning in all of the above aspects via the following key contributions:
- Graph-enhanced neural learning for node classification (G-Net): Because given graphs are often incomplete (with missing links) or noisy, using them directly in graph-based classification would be sub-optimal. We therefore propose to jointly regularize the given graph and optimize the parameters of a neural network (NNet) during classifier training, combining the strengths of neural classification and spectral transformation of the input graph.
- Graph convolutional matrix factorization for bipartite edge prediction: For a specific category of graphs, namely bipartite graphs, traditional matrix-factorization methods cannot effectively leverage side information such as similarity measurements within each of the two groups of nodes. We therefore propose to use graph convolutions to enrich the learned factorized representations with this structured side information, yielding better prediction accuracy.
- Using Graph Neural Networks (GNNs) for general edge prediction: While GNNs have had great success in node classification, their application to edge prediction has not achieved a matching level of performance. A possible explanation for this phenomenon is that the latent embeddings in GNNs rely heavily on the input node features; if those features are of poor quality, or noisy for the prediction task at hand, then sub-optimal performance is unavoidable. We propose to address this issue through a combined use of a traditional GNN and the Transformer model, which yields improved node embeddings via the flexible positional embeddings of the Transformer.
- Graph-enhanced Active Learning (Graph-AL) for node classification: Active learning (AL) has been intensively studied as a remedy for label sparsity and successfully applied to text, video, and audio data, but not to graphs. Popular AL strategies may not be directly applicable to graphs; for example, density-based document selection treats all candidate documents as unrelated instances, ignoring the dependency structure among nodes in the input graph. We propose the first graph-based active learning approach tailored to Graph Neural Networks, which takes both node-internal features and cross-node connections into account when selecting nodes for labeling.
- Various real-world applications of large-scale graph-based learning: We have applied graph-based learning to a variety of real-world problems, including multi-graph collaborative filtering, graph-based transfer learning across languages, graph-based deep learning for epidemiology prediction, graph-enhanced node classification, edge detection, and knowledge-base completion, obtaining state-of-the-art results in each of those domains at the time.
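The joint objective behind graph-regularized neural classification (as in G-Net) can be illustrated with a minimal numpy sketch: a supervised cross-entropy loss plus a spectral smoothness penalty tr(F^T L F) that pushes connected nodes toward similar predictions. The function names, the unnormalized Laplacian, and the fixed weight lam are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

def laplacian(adj):
    """Unnormalized graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

def graph_regularized_loss(logits, labels, adj, lam=0.1):
    """Cross-entropy on labeled nodes plus the smoothness penalty
    tr(F^T L F), which is small when connected nodes receive
    similar predicted class distributions."""
    z = logits - logits.max(axis=1, keepdims=True)  # numerically stable softmax
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()
    smooth = np.trace(probs.T @ laplacian(adj) @ probs)
    return ce + lam * smooth
```

In a full model, both the network parameters (producing the logits) and a parameterized transformation of the graph would be optimized jointly against this kind of objective.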
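The idea of enriching factorized representations with structured side information, as in the bipartite edge-prediction contribution, can be sketched as one graph-convolution (smoothing) step over each side's similarity graph before scoring edges by inner product. This is a toy version under stated assumptions: the symmetric normalization and the function names are chosen for illustration, not taken from the thesis.

```python
import numpy as np

def normalize_adj(sim):
    """Symmetric normalization D^{-1/2} (S + I) D^{-1/2} with self-loops."""
    s = sim + np.eye(sim.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(s.sum(axis=1))
    return s * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def predict_scores(U, V, user_sim, item_sim):
    """Smooth each factor matrix over its side-information graph
    (one graph-convolution step), then score bipartite edges by
    inner product of the smoothed factors."""
    U_s = normalize_adj(user_sim) @ U
    V_s = normalize_adj(item_sim) @ V
    return U_s @ V_s.T
```

With empty similarity graphs the normalization reduces to the identity, so the sketch degenerates to plain matrix factorization; the side information only changes predictions when the graphs carry structure.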
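A graph-based AL selection rule of the kind described above, combining a node-internal uncertainty signal with a cross-node structural signal, might look like the following sketch. The specific entropy/degree combination and the weight alpha are illustrative assumptions, not the thesis's actual selection criterion.

```python
import numpy as np

def select_node(probs, adj, labeled, alpha=0.5):
    """Pick the unlabeled node maximizing a convex combination of
    predictive entropy (node-internal feature signal) and degree
    centrality (cross-node graph signal), both rescaled to [0, 1]."""
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    degree = adj.sum(axis=1)

    def scale(x):
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    score = alpha * scale(entropy) + (1 - alpha) * scale(degree)
    score[list(labeled)] = -np.inf  # never re-query labeled nodes
    return int(np.argmax(score))
```

In contrast to density-based document selection, the degree term here explicitly uses the dependency structure among nodes, so a well-connected uncertain node is preferred over an equally uncertain isolated one.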
Yiming Yang (Chair)
Huan Liu (Arizona State University)