Poster in Workshop: Shared Visual Representations in Human and Machine Intelligence (SVRHM)
Fast temporal decoding from large-scale neural recordings in monkey visual cortex
Jerome Hadorn · Zuowen Wang · Bodo Rueckauer · Xing Chen · Pieter Roelfsema · Shih-Chii Liu
Abstract:
With new developments in electrode and nanoscale technology, a large-scale multi-electrode cortical neural prosthesis with thousands of stimulation and recording electrodes is becoming viable. Such a system will be useful both as a neuroscience tool and as a neuroprosthesis. In the context of a visual neuroprosthesis, a rudimentary form of vision can be presented to the visually impaired by stimulating the electrodes to induce phosphene patterns. Additional feedback in a closed-loop system can be provided by rapid decoding of recorded responses from relevant brain areas. This work presents temporal decoding results from a dataset of 1024-electrode recordings collected from the V1 and V4 areas of a primate performing a visual discrimination task. By applying deep learning models, the peak decoding accuracy from the V1 data can be obtained with a moving time window of 150 ms across the 800 ms phase of stimulus presentation. The peak accuracy from the V4 data is achieved at a larger latency and with a larger moving time window of 300 ms. Decoding the V1 data with a running window of 30 ms showed only a $4\%$ drop in peak accuracy. We also determined the robustness of the decoder to electrode failure by choosing a subset of important electrodes using a previously reported algorithm for scaling the importance of inputs to a network. Results show that the accuracy of $89.6\%$ from a network trained on the selected subset of 256 electrodes is close to the accuracy of $91.1\%$ obtained using all 1024 electrodes.
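
The abstract does not specify the decoder architecture or preprocessing, so the following is only a minimal sketch of the moving-window decoding idea it describes: slide a fixed-width window across binned multi-electrode activity, train a classifier on each window, and track how accuracy varies with window onset. All names and parameters here (bin width, `NUM_CLASSES`, the MLP, the train/test split) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of moving-window decoding from binned multi-electrode data.
# Assumptions (not from the paper): 10 ms spike-count bins, a small MLP
# classifier, a fixed 80/20 train/test split, and synthetic stand-in data.
import numpy as np
import torch
import torch.nn as nn

BIN_MS = 10            # assumed bin width
N_ELECTRODES = 1024
N_BINS = 80            # 800 ms stimulus phase / 10 ms bins
NUM_CLASSES = 2        # placeholder; the task's class count is not given here


def decode_window(X, y, start_bin, width_bins, epochs=20, lr=1e-3):
    """Train a small classifier on one time window and return test accuracy.

    X: (n_trials, N_ELECTRODES, N_BINS) binned spike counts
    y: (n_trials,) integer class labels
    """
    # Slice the moving window and flatten electrodes x time into features.
    Xw = X[:, :, start_bin:start_bin + width_bins].reshape(len(X), -1)
    Xw = torch.tensor(Xw, dtype=torch.float32)
    yt = torch.tensor(y, dtype=torch.long)

    n_train = int(0.8 * len(X))  # simple split; the paper's protocol may differ
    model = nn.Sequential(
        nn.Linear(Xw.shape[1], 256), nn.ReLU(), nn.Linear(256, NUM_CLASSES)
    )
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(Xw[:n_train]), yt[:n_train])
        loss.backward()
        opt.step()

    with torch.no_grad():
        pred = model(Xw[n_train:]).argmax(dim=1)
        return (pred == yt[n_train:]).float().mean().item()


# Example: sweep a 150 ms window (15 bins) across the 800 ms stimulus phase,
# using random Poisson counts in place of the actual recordings.
X = np.random.poisson(1.0, size=(200, N_ELECTRODES, N_BINS)).astype(np.float32)
y = np.random.randint(0, NUM_CLASSES, size=200)
width = 150 // BIN_MS
accuracies = [decode_window(X, y, s, width) for s in range(0, N_BINS - width + 1, 5)]
print("peak accuracy:", max(accuracies))
```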