Comments for Priorlabs
http://www.priorlabs.com
A Data science and machine learning lab
Thu, 12 Oct 2017 20:37:14 +0000

Comment on "Two stream convolutional network for predicting social matches in linkedin-data" by priorlabs
http://www.priorlabs.com/2017/04/06/two-stream-convolutional-network-for-predicting-social-matches-in-linkedin-data/#comment-8
Thu, 12 Oct 2017 20:37:14 +0000

Good catch! Thank you.

Comment on "Two stream convolutional network for predicting social matches in linkedin-data" by lipn
http://www.priorlabs.com/2017/04/06/two-stream-convolutional-network-for-predicting-social-matches-in-linkedin-data/#comment-7
Tue, 03 Oct 2017 12:48:41 +0000

Hello, I am reading your tutorial, which I find very useful. For fully connected layers, when ReLU activation is preferred, shouldn't the code be:

activation = tf.nn.dropout(tf.nn.relu(self.layer(x_in)), keep_prob)
instead of
activation = tf.nn.dropout(tf.sigmoid(self.layer(x_in)), keep_prob) ?

Please excuse my ignorance if I am mistaken; I am new to deep learning.
Thank you!

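For readers following the exchange above, here is a minimal NumPy sketch of what the commenter proposes: a ReLU activation (rather than a sigmoid) followed by dropout, as in `tf.nn.dropout(tf.nn.relu(self.layer(x_in)), keep_prob)`. This is an illustration only, not the tutorial's actual code; `self.layer(x_in)` is assumed to produce a linear pre-activation, for which the `pre_activation` array below stands in.

```python
import numpy as np

def relu(x):
    # ReLU zeroes out negative pre-activations
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid squashes pre-activations into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def dropout(x, keep_prob, rng):
    # Inverted dropout in the style of TF1's tf.nn.dropout: keep each
    # unit with probability keep_prob and scale survivors by
    # 1/keep_prob, so the expected activation is unchanged.
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

rng = np.random.default_rng(0)
# Stand-in for the output of self.layer(x_in)
pre_activation = np.array([-2.0, -0.5, 0.5, 2.0])

relu_out = dropout(relu(pre_activation), keep_prob=0.8, rng=rng)
sigmoid_out = dropout(sigmoid(pre_activation), keep_prob=0.8, rng=rng)
```

Note that ReLU leaves hidden units unbounded and sparse (negative inputs become exactly zero), while sigmoid compresses everything into (0, 1); which is preferable depends on the layer's role, which is the point of the commenter's question.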