All Stories

  1. Gamification Enhances User Engagement and Task Performance in Prosthetic Vision Testing
  2. Look, predict, intercept: Visual exposure seeds model-based control in moving-target interception
  3. Static or Temporal? Semantic Scene Simplification to Aid Wayfinding in Immersive Simulations of Bionic Vision
  4. Deep Learning–Based Control of Electrically Evoked Activity in Human Visual Cortex
  5. Distinct Roles of Central and Peripheral Vision in Rapid Scene Understanding
  6. Simulated prosthetic vision confirms checkerboard as an effective raster pattern for epiretinal implants
  7. A Deep Learning Framework for Predicting Functional Visual Performance in Bionic Eye Users
  8. Axonal stimulation affects the linear summation of single-point perception in three Argus II users
  9. Explainable machine learning predictions of perceptual sensitivity for retinal prostheses
  10. Efficient multi-scale representation of visual objects using a biologically plausible spike-latency code and winner-take-all inhibition
  11. Retinal ganglion cells undergo cell type–specific functional changes in a biophysically detailed model of retinal degeneration
  12. Towards a Smart Bionic Eye: AI-powered artificial vision for the treatment of incurable blindness
  13. The Relative Importance of Depth Cues and Semantic Edges for Indoor Mobility Using Simulated Prosthetic Vision in Immersive Virtual Reality
  14. Factors affecting two-point discrimination in Argus II patients
  15. Cortical Motion Perception Emerges from Dimensionality Reduction with Evolved Spike-Timing-Dependent Plasticity Rules
  16. Improving bionic vision with deep learning
  17. Immersive Virtual Reality Simulations of Bionic Vision
  18. Greedy Optimization of Electrode Arrangement for Epiretinal Prostheses
  19. Learning to see again: Perceptual learning of simulated abnormal on-off-cell population responses in sighted individuals
  20. A Computational Model of Phosphene Appearance for Epiretinal Prostheses
  21. Explainable AI for Retinal Prostheses: Predicting Electrode Deactivation from Routine Clinical Measures
  22. Deep Learning–Based Scene Simplification for Bionic Vision
  23. Towards Immersive Virtual Reality Simulations of Bionic Vision
  24. U-Net with Hierarchical Bottleneck Attention for Landmark Detection in Fundus Images of the Degenerated Retina
  25. Data-driven models in human neuroscience and neuroengineering
  26. Model-Based Recommendations for Optimal Surgical Placement of Epiretinal Implants
  27. Neural correlates of sparse coding and dimensionality reduction
  28. A model of ganglion axon pathways accounts for percepts elicited by retinal implants
  29. Commentary: Detailed Visual Cortical Responses Generated by Retinal Sheet Transplants in Rats With Severe Retinal Degeneration
  30. Biophysical model of axonal stimulation in epiretinal visual prostheses
  31. On the potential role of retinal sheet transplants for sight restoration
  32. CARLsim 4: An Open Source Library for Large Scale, Biologically Detailed Spiking Neural Network Simulation using Heterogeneous Clusters
  33. Learning to see again: biological constraints on cortical plasticity and the implications for sight restoration technologies
  34. Sparse coding and dimensionality reduction in cortex
  35. pulse2percept: A Python-based simulation framework for bionic vision
  36. Our brains help us navigate by using a compressed code for what we see when we move
  37. A GPU-accelerated cortical neural network model for visually guided robot navigation
  38. CARLsim 3: A user-friendly and highly optimized library for the creation of neurobiologically detailed spiking neural networks
  39. Vision-based robust road lane detection in urban environments
  40. Efficient Spiking Neural Network Model of Pattern Motion Selectivity in Visual Cortex
  41. GPGPU accelerated simulation and parameter tuning for neuromorphic applications
  42. Categorization and decision-making in a neurobiologically plausible spiking network using a STDP-like learning rule
  43. Exploring olfactory sensory networks: Simulations and hardware emulation