Acknowledgements


Funding

The development of BITTSy was funded by NSF grant BCS1152109, "New Tools for New Questions: A Multi-Site Approach to Studying the Development of Selective Attention."

Grant researchers

Rochelle Newman, University of Maryland College Park

Elizabeth Johnson, University of Toronto at Mississauga

Ruth Tincoff, The College of Idaho

Kris Onishi, McGill University

Beta testing sites

Speech Language Acquisition & Multilingualism (S.L.A.M.) Lab, University of Delaware
Primary investigator - Giovanna Morini
Lab manager - Emily Fritzson

Tweety Language Development Lab, University of Arizona
Primary investigator - LouAnn Gerken
Postdoctoral researcher - Megan Figueroa

Language Development Lab & Canine Language Perception Lab, University of Maryland College Park
Primary investigator - Rochelle Newman
Lab manager - Emily Shroads

Development team

  • Kerem Atalay, programmer, 2017-2018

  • Nadiya Klymenko, programmer, 2018-2019

  • Jayan Kamdar, programmer, 2019-

  • Ed Smith, technical supervisor

  • Emily Shroads, project manager

Documentation and manual

  • Rochelle Newman

  • Giovanna Morini

  • Emily Shroads
