爱可可 AI Paper Recommendations (October 13)

LG - Machine Learning; CV - Computer Vision; CL - Computation and Language
1. [CV]* GRF: Learning a General Radiance Field for 3D Scene Representation and Rendering
A Trevithick, B Yang
[Williams College & University of Oxford]
An implicit neural function (GRF) that represents and renders arbitrarily complex 3D scenes in a single network from only 2D observations. It exploits the principles of multi-view geometry to build internal representations from the observed 2D views, accurately mapping 2D pixel features into 3D space so that the learned implicit representation is meaningful and consistent across views, and it uses an attention mechanism to implicitly resolve visual occlusions. With GRF, realistic novel 2D views can be synthesized (a minimal code sketch of the idea follows the abstract below).
We present a simple yet powerful implicit neural function that can represent and render arbitrarily complex 3D scenes in a single network only from 2D observations. The function models 3D scenes as a general radiance field, which takes a set of posed 2D images as input, constructs an internal representation for each 3D point of the scene, and renders the corresponding appearance and geometry of any 3D point viewing from an arbitrary angle. The key to our approach is to explicitly integrate the principle of multi-view geometry to obtain the internal representations from observed 2D views, guaranteeing the learned implicit representations meaningful and multi-view consistent. In addition, we introduce an effective neural module to learn general features for each pixel in 2D images, allowing the constructed internal 3D representations to be remarkably general as well. Extensive experiments demonstrate the superiority of our approach.
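The core mechanism lends itself to a short illustration: project a query 3D point into each posed input image, sample per-view pixel features at the projected location, aggregate them across views with attention, and decode color and density for volume rendering. The sketch below is only that, a simplified illustration under assumed interfaces (a tiny CNN feature extractor, pinhole cameras, softmax attention over views); it is not the authors' implementation and all names are hypothetical.

```python
# Minimal, illustrative sketch of the GRF idea (not the authors' code).
# Assumptions: a tiny CNN feature extractor, pinhole cameras (3x3 intrinsics K,
# 4x4 world-to-camera extrinsics), and softmax attention over views.
import torch
import torch.nn as nn


def project(points, K, w2c):
    """Project world-space points (N, 3) to pixel coordinates (N, 2)."""
    pts_h = torch.cat([points, torch.ones_like(points[:, :1])], dim=-1)  # (N, 4)
    cam = (w2c @ pts_h.T).T[:, :3]            # camera-space coordinates
    uv = (K @ cam.T).T                        # homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]             # perspective divide


class GRFSketch(nn.Module):
    def __init__(self, feat_dim=32):
        super().__init__()
        # shared 2D feature extractor applied to every input view
        self.cnn = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1),
        )
        # attention: score each per-view feature given the query view direction
        self.score = nn.Linear(feat_dim + 3, 1)
        # decode aggregated point feature + view direction into RGB and density
        self.head = nn.Sequential(nn.Linear(feat_dim + 3, 64), nn.ReLU(), nn.Linear(64, 4))

    def forward(self, points, view_dirs, images, Ks, w2cs):
        """points/view_dirs: (N, 3); images: (V, 3, H, W); Ks: (V, 3, 3); w2cs: (V, 4, 4)."""
        feats = self.cnn(images)                              # (V, C, H, W)
        H, W = images.shape[-2:]
        per_view = []
        for v in range(images.shape[0]):
            uv = project(points, Ks[v], w2cs[v])              # (N, 2) pixel coordinates
            grid = torch.stack([uv[:, 0] / (W - 1), uv[:, 1] / (H - 1)], -1) * 2 - 1
            sampled = nn.functional.grid_sample(              # bilinear lookup of 2D features
                feats[v:v + 1], grid.view(1, -1, 1, 2), align_corners=True
            ).squeeze(-1).squeeze(0).T                        # (N, C)
            per_view.append(sampled)
        per_view = torch.stack(per_view, dim=1)               # (N, V, C)
        dirs = view_dirs.unsqueeze(1).expand(-1, per_view.shape[1], -1)
        attn = torch.softmax(self.score(torch.cat([per_view, dirs], -1)), dim=1)
        point_feat = (attn * per_view).sum(dim=1)             # attention-weighted aggregation
        rgb_sigma = self.head(torch.cat([point_feat, view_dirs], -1))
        rgb, sigma = torch.sigmoid(rgb_sigma[:, :3]), torch.relu(rgb_sigma[:, 3])
        return rgb, sigma                                     # composited by a NeRF-style renderer
```

The per-point colors and densities returned here would then be composited along camera rays with a standard volume renderer, which is omitted from the sketch.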
2. [LG]* No MCMC for me: Amortized sampling for fast and stable training of energy-based models
W Grathwohl, J Kelly, M Hashemi, M Norouzi, K Swersky, D Duvenaud
[Google Research & University of Toronto]
A simple method (VERA) for training energy-based models (EBMs) at scale: an entropy-regularized generator amortizes the MCMC sampling normally used for EBM training, and a fast variational approximation improves on prior MCMC-based entropy-regularization methods. Applied to the Joint Energy Model (JEM), the estimator yields much faster training while matching the original performance (a minimal sketch of the training loop follows the abstract below).
Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty. Despite recent advances, training EBMs on high-dimensional data remains a challenging problem as the state-of-the-art approaches are costly, unstable, and require considerable tuning and domain expertise to apply successfully. In this work we present a simple method for training EBMs at scale which uses an entropy-regularized generator to amortize the MCMC sampling typically used in EBM training. We improve upon prior MCMC-based entropy regularization methods with a fast variational approximation. We demonstrate the effectiveness of our approach by using it to train tractable likelihood models. Next, we apply our estimator to the recently proposed Joint Energy Model (JEM), where we match the original performance with faster and stable training. This allows us to extend JEM models to semi-supervised classification on tabular data from a variety of continuous domains.
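To make the amortized-sampling idea concrete, here is a minimal training-loop sketch under stated simplifications: the generator is a reparameterized Gaussian, the data batch is a placeholder, and the generator's conditional entropy term stands in for VERA's fast variational estimate of the entropy gradient. It shows the structure of the method (an EBM update contrasting data with generator samples, then a generator update that lowers energy while keeping entropy high), not the paper's exact estimator.

```python
# Minimal, illustrative sketch of amortized EBM training with an
# entropy-regularized generator, in the spirit of VERA (not the authors' code).
import torch
import torch.nn as nn

dim, z_dim = 2, 16
energy = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1))      # E_theta(x)
generator = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, dim))
log_sigma = nn.Parameter(torch.zeros(dim))    # output noise scale of the generator

opt_e = torch.optim.Adam(energy.parameters(), lr=1e-4)
opt_g = torch.optim.Adam(list(generator.parameters()) + [log_sigma], lr=1e-4)

def sample_generator(n):
    """Reparameterized sample from the generator distribution."""
    z = torch.randn(n, z_dim)
    mu = generator(z)
    return mu + log_sigma.exp() * torch.randn_like(mu)

for step in range(1000):
    x_data = torch.randn(128, dim)            # placeholder for a real data batch

    # EBM update: push energy down on data, up on generator samples
    x_gen = sample_generator(128).detach()
    loss_e = energy(x_data).mean() - energy(x_gen).mean()
    opt_e.zero_grad()
    loss_e.backward()
    opt_e.step()

    # Generator update: lower the energy of its samples while keeping entropy high.
    # The conditional-entropy surrogate below is a crude stand-in for VERA's
    # fast variational approximation of the entropy gradient.
    x_gen = sample_generator(128)
    entropy_surrogate = log_sigma.sum()
    loss_g = energy(x_gen).mean() - entropy_surrogate
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```

In the paper, a fast variational entropy estimate takes the place of both MCMC sampling and the crude surrogate above, which is what the faster, more stable training is attributed to.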

