Presented at Berlin.js May 31st 2018 for the #jsconfeu Special. Almost every video call begins with the same clumsy questions: Can you hear me now? Did I just turn off my camera instead of my mic? But what if we could take the awkward troubleshooting out of the conversation and solve it with code instead? In this talk, Ingvild Indrebø will give you a glimpse into aspects of WebRTC, WebAudio and Canvas, by showing you how she used these technologies to build a user-friendly and accessible tool to make sure you’re all set for your video call.
27. We already have access to the microphone. But can we hear anything?
28. “WebAudio is a system for controlling audio on the Web, allowing developers to choose audio sources, add effects to audio, create audio visualizations, apply spatial effects (such as panning) and much more…” (https://developer.mozilla.org)
29. Audio sources
• computed mathematically (such as OscillatorNode)
• recordings from audio/video files (<audio/>, <video/>)
• WebRTC MediaStream
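The third source on the list is the one a video-call check needs. A minimal sketch of wiring a microphone MediaStream into the Web Audio graph through an AnalyserNode (the function names here are my own, not the talk's):

```javascript
// Sketch: plug a WebRTC MediaStream into the Web Audio graph so we can
// inspect its volume data. connectMicrophone/fftSizeForBuckets are
// illustrative names, not from the talk.
async function connectMicrophone() {
  // Ask the browser for microphone input.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  // Turn the MediaStream into a Web Audio source node...
  const source = audioCtx.createMediaStreamSource(stream);
  // ...and connect it to an analyser we can read frequency data from.
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = fftSizeForBuckets(1024);
  source.connect(analyser);
  return analyser;
}

// An AnalyserNode's frequencyBinCount is always half its fftSize, so to
// get a desired number of frequency buckets we ask for twice that.
function fftSizeForBuckets(bucketCount) {
  return bucketCount * 2;
}
```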
45. Frequencies are split into buckets: 0–22 Hz, 23–46 Hz, 47–68 Hz, 69–90 Hz, and so on. Each bucket contains the total volume of that frequency range.
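The roughly 22 Hz bucket width on the slide is consistent with the analyser's frequency resolution, which is sampleRate / fftSize. A small sketch, assuming a common 44100 Hz sample rate and the default fftSize of 2048 (both assumptions on my part):

```javascript
// Each analyser bucket covers sampleRate / fftSize hertz: the analyser
// spans 0 .. sampleRate/2 across fftSize/2 buckets.
function bucketWidthHz(sampleRate, fftSize) {
  return sampleRate / fftSize;
}

// With 44100 Hz audio and the default fftSize of 2048, each bucket is
// about 21.5 Hz wide, matching the ~22 Hz-wide ranges on the slide.
const width = bucketWidthHz(44100, 2048);
```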
47. function canWeHearYou(analyser) {
  const dataArray = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(dataArray);
  const sum = dataArray.reduce((a, b) => a + b, 0);
  return sum > 0;
}
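A single silent frame doesn't prove the microphone is broken; the user may simply not be speaking at that instant. One way to build on the function above is to sample repeatedly and only report failure after a sustained stretch of silence. A sketch, where `watchMicrophone`, `hasSignal` and the `silentLimit` threshold are my own assumptions rather than the talk's code:

```javascript
// Any non-zero bucket means the analyser picked up some sound.
function hasSignal(dataArray) {
  return dataArray.some((v) => v > 0);
}

// Poll the analyser once per animation frame; call onSilence only after
// silentLimit consecutive silent frames (threshold chosen arbitrarily).
function watchMicrophone(analyser, onSilence, silentLimit = 50) {
  const dataArray = new Uint8Array(analyser.frequencyBinCount);
  let silentFrames = 0;
  function check() {
    analyser.getByteFrequencyData(dataArray);
    silentFrames = hasSignal(dataArray) ? 0 : silentFrames + 1;
    if (silentFrames >= silentLimit) onSilence();
    requestAnimationFrame(check);
  }
  check();
}
```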
54. draw(dataArray) {
  const sliceWidth = CANVAS_WIDTH / dataArray.length;
  let x = 0;
  for (const y of dataArray) {
    canvasCtx.lineTo(x, y);
    x += sliceWidth;
  }
}
55. draw(dataArray) {
  const sliceWidth = CANVAS_WIDTH / dataArray.length;
  let x = 0;
  for (const y of dataArray) {
    canvasCtx.lineTo(x, y);
    x += sliceWidth;
  }
  canvasCtx.lineTo(canvas.width, 0);
}
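The slides show only the core of the loop; to actually render, a Canvas path needs beginPath/moveTo before the lineTo calls and a stroke afterwards. A fuller sketch, where `drawFrame` and `toPoints` are hypothetical names and the clearRect/beginPath/stroke calls are my additions around the slide's logic:

```javascript
// Map frequency buckets to canvas (x, y) points: x is spread evenly
// across the canvas width, y is the bucket's volume.
function toPoints(dataArray, canvasWidth) {
  const sliceWidth = canvasWidth / dataArray.length;
  return Array.from(dataArray, (y, i) => [i * sliceWidth, y]);
}

// One animation frame: read fresh frequency data, redraw the curve,
// then schedule the next frame.
function drawFrame(canvasCtx, analyser, canvas) {
  const dataArray = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(dataArray);
  canvasCtx.clearRect(0, 0, canvas.width, canvas.height);
  canvasCtx.beginPath();
  canvasCtx.moveTo(0, 0);
  for (const [x, y] of toPoints(dataArray, canvas.width)) {
    canvasCtx.lineTo(x, y);
  }
  canvasCtx.lineTo(canvas.width, 0); // close the curve at the right edge
  canvasCtx.stroke();
  requestAnimationFrame(() => drawFrame(canvasCtx, analyser, canvas));
}
```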