This post is about 3,567 characters; estimated reading time 11 minutes.
Author: chen_h
WeChat & QQ: 862251340; WeChat official account: coderpai

1. [Code] Visual Question Answering in PyTorch
Summary:
This repo was made by (LIP6) and (LIP6-Heuritech), two PhD students working on VQA at , and their professors (LIP6) and (LIP6-CNAM). We developed this code as part of a research paper called , which is (as far as we know) the current state of the art on the .
The goal of this repo is twofold. If you have any questions about our code or model, don't hesitate to contact us or to submit an issue. Pull requests are welcome!
Original link:
2. [Blog] Why Does Deep Learning Not Have a Local Minimum?
Summary:
Yes, there is a 'theoretical justification', and it has taken a couple of decades to flesh it out.
I will first point out, however, that it has been observed in practice. This was pointed out by LeCun in his early work on LeNet, and is actually discussed in the 'orange book', "Pattern Classification" by Richard O. Duda, Peter E. Hart, and David G. Stork.
Original link:
3. [Blog] Graph-based machine learning: Part I
Summary:
During the seven-week , recent grads and experienced software engineers learn the by building a data platform to handle large, real-time datasets.

(now a Data Science Engineer at ) discusses his project on community detection on large datasets.
Original link:
4. [Blog] Deep Learning the Stock Market
Summary:
In the past few months I’ve been fascinated with “Deep Learning”, especially its applications to language and text. I’ve spent the bulk of my career in financial technologies, mostly in algorithmic trading and alternative data services. You can see where this is going.
I wrote this to get my ideas straight in my head. While I’ve become a “Deep Learning” enthusiast, I don’t have too many opportunities to brain dump an idea in most of its messy glory. I think that a decent indication of a clear thought is the ability to articulate it to people not from the field. I hope that I’ve succeeded in doing that and that my articulation is also a pleasurable read.
Original link:
5. [Paper] Deep Learning in Trading
Summary:
Current state of the art
LSTM is the holy grail of sequence prediction.

A major part of financial modelling is sequence prediction, whether that's volatility models, volume models, or the toughest one of all: return prediction models. The underlying task in such problems is: given a sequence of values, can we predict the next number in the sequence? LSTM models naturally fit this criterion because of their recursive nature. Additionally, the hidden state and the memory cell tremendously help retain the useful features of the sequence.

Feature engineering is a thing of the past in the era of neural networks. Neural networks are really good at coming up with features on their own. A number of people in finance work day in, day out on coming up with features; neural nets are poised to take over this segment of the market.

Neural networks provide an easy way to combine market data and other data sources. Since neural nets work in the latent space, it's super easy to combine your market data input with other data sources you might have. That can be anything from sentiment analysis or summaries of SEC filings to visual or audio inputs. Additionally, neural networks make it easy to do multivariate modelling where there are a lot of relationships between inputs and a time-varying nature to them.

It's important to understand when neural networks do not work: they don't work if you don't have enough data. Small datasets are bottlenecks when it comes to convergence; large datasets come with computation problems.

Original link:
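The next-value prediction task described above ("given a sequence of values, can we predict the next number?") can be sketched with a small LSTM in PyTorch. The model, the window length, and the sine-wave toy data below are hypothetical illustrations, not taken from the paper:

```python
# Minimal sketch: next-value sequence prediction with an LSTM (PyTorch).
# NextValueLSTM, window=20, and the sine-wave data are all illustrative choices.
import math
import torch
import torch.nn as nn

class NextValueLSTM(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        # One input feature per time step; the hidden state and cell
        # carry a summary of the sequence forward.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        # Use the last hidden state to predict the next value.
        return self.head(out[:, -1, :])

# Toy data: sliding windows over a sine wave, target is the next point.
t = torch.linspace(0, 8 * math.pi, 400)
series = torch.sin(t)
window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = NextValueLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for _ in range(50):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```

Note how no hand-crafted features are needed here: the raw windowed series goes straight into the LSTM, which learns its own representation, matching the "feature engineering is a thing of the past" claim above.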