Open-sourcing PyTorch-BigGraph for faster embeddings of extremely large graphs


Graph embeddings, like text embeddings, are trained with randomly constructed "false" edges as negative examples alongside the true positive edges. We observed that in traditional methods for generating graph embeddings, almost all of the training time is spent processing these negative edges. PBG allows the entire AI community to train embeddings for large graphs (including knowledge graphs, as well as other large graphs such as graphs of stock transactions, online content, and biological data) without specialized computing resources like GPUs or huge amounts of memory.
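
To make the negative-sampling idea concrete, here is a minimal sketch in plain PyTorch of how random "false" edges can be constructed and scored against true edges with a margin ranking loss. This is an illustration only, not the PBG API; all names and sizes (num_nodes, embed_dim, num_negatives, score, training_step) are assumptions made for the example.

```python
import torch
import torch.nn as nn

# Illustrative sizes; PBG itself is built to scale far beyond this.
num_nodes, embed_dim, num_negatives = 10_000, 128, 100
embeddings = nn.Embedding(num_nodes, embed_dim)

def score(src, dst):
    # Dot-product score between source and destination embeddings.
    return (embeddings(src) * embeddings(dst)).sum(dim=-1)

def training_step(pos_src, pos_dst, margin=1.0):
    # Scores for the true (positive) edges in the batch.
    pos_score = score(pos_src, pos_dst)                        # (batch,)

    # Construct "false" edges by replacing each edge's destination
    # with randomly sampled nodes; these are the negative examples.
    neg_dst = torch.randint(num_nodes, (pos_src.numel(), num_negatives))
    neg_score = score(pos_src.unsqueeze(1), neg_dst)           # (batch, num_negatives)

    # Margin ranking loss: push positive scores above negatives by `margin`.
    return torch.clamp(margin - pos_score.unsqueeze(1) + neg_score, min=0).mean()

# Example batch of positive edges.
src = torch.randint(num_nodes, (512,))
dst = torch.randint(num_nodes, (512,))
print(training_step(src, dst))
```

Because each positive edge is compared against many sampled negatives, the negatives dominate the cost of a step like this, which is the bottleneck noted above; PBG reduces that cost in part by reusing negative samples across batches of edges, an optimization omitted here for brevity.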

Source: ai.facebook.com