How I used INFORMATION THEORY to choose better Keybindings

Sammy Engineering
21 Mar 2024 · 13:40

Summary

TL;DR: This video shows how to use information theory to optimize Emacs keybindings. Keystrokes are treated as information sent over a channel, the entropy of the commands is calculated, and Huffman coding is used to find more efficient keybindings. The author measures how often he actually uses each Emacs command, analyzes that data, and applies information-theoretic principles to it, showing that well-chosen keybindings let you communicate the commands you type with the minimum number of keystrokes.

Takeaways

  • 📚 Use concepts from information theory to optimize Emacs keybindings.
  • 🕵️ Treat keystrokes sent to Emacs as information transmitted over a communication channel.
  • 🔢 Calculate the "entropy of my hands" (the entropy of the author's keystrokes) and construct a Huffman encoding.
  • 🎯 Use that Huffman encoding to devise more efficient keybindings for yourself.
  • 📈 Claude Shannon devised information theory while working out how to send telegraph messages more efficiently.
  • 📊 Shannon concluded that the optimal encoding ties each code's length to the probability of the signal it represents (roughly log 1/p).
  • 🔧 Add a keylogger to your Emacs configuration and collect data on the Emacs commands you run over two-plus months.
  • 📊 Analyze the data to identify the most frequently used commands and assign efficient keybindings to them.
  • 🌟 Information theory shows not only how good the optimal encoding is, but also the path to reach it.
  • 🔄 Compute the difference between the length of each current keybinding and its ideal length to find where there is room for improvement.
  • 🔄 Reassign keybindings that are too long for frequently used commands and ones that are too short for rarely used commands.
  • 🛠️ The analysis code is published on GitHub, and viewers are encouraged to run it on their own Emacs configurations and share the results.

Q & A

  • What is the "information theory" discussed in this video?

    -Information theory is the mathematical theory of quantifying information and maximizing the efficiency of communicating and processing data. In this video it is used to optimize Emacs keybindings.

  • Under what circumstances did Claude Shannon come up with information theory?

    -Claude Shannon invented information theory while working at Bell Labs in the 1940s, trying to find a more efficient way to send messages over the telegraph.

  • What is a Huffman code?

    -A Huffman code is an algorithm that assigns optimal code lengths so that the average encoded length comes as close as possible to the entropy of the data. It gives shorter codes to more common symbols and longer codes to rarer ones, which shortens the average code length.

  • How was the data collected for this video analyzed?

    -The author added a keylogger to Emacs to record every command entered, then analyzed the collected log to compute how often each command was used and, from that, the probability of entering each command.

  • How do you choose optimal keybindings based on information theory?

    -By assigning each command a code length based on the probability that the command occurs, approximately log2 of 1 over that probability, which minimizes the average code length. A short sketch illustrating this rule appears at the end of this Q&A section.

  • What was the most frequently used command found in the video's analysis?

    -The most frequently used command was self-insert, which is the basic operation of typing characters into the buffer.

  • What problem becomes apparent from the analysis results?

    -A purely efficiency-driven encoding loses meaning. If frequently used commands are assigned arbitrary key sequences, the bindings become hard to remember and unintuitive.

  • What steps matter most when actually applying optimal keybindings?

    -Assign short key sequences to frequently used commands and longer ones to rarely used commands, and reassign bindings carefully so that they keep their meaning rather than becoming arbitrary.

  • What was needed to actually apply the results of the analysis to Emacs?

    -The optimal keybinding lengths from the analysis had to be compared against the current bindings, and after identifying the necessary changes, the worst offenders had to be rebound to the new configuration.

  • How could analysis based on information theory influence UI design and HCI?

    -The same approach could become a powerful tool for finding optimal layouts in general UI design and HCI, for example how menu items are placed or how multi-step menus are organized.

  • What projects does the video's creator want to work on next?

    -The creator would like to design a website that uses information theory together with logging and analytics to decide how menu items and multi-step menus are organized, and also plans to package the code from this video into an easy-to-use library with a user interface.
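
As a concrete illustration of the code-length rule above, here is a minimal Python sketch; the command names and probabilities are made up for illustration and are not the author's measured data.

    import math

    # Hypothetical probabilities for four commands (not the author's data).
    probs = {"next-line": 0.5, "previous-line": 0.25, "save-buffer": 0.125, "undo": 0.125}

    for cmd, p in probs.items():
        # Shannon's rule: an optimal code gives this symbol about log2(1/p) bits.
        print(f"{cmd}: ideal length is about {math.log2(1 / p):.1f} bits")

    # The probability-weighted average of those lengths is the entropy of the distribution.
    print(f"average: {sum(p * math.log2(1 / p) for p in probs.values()):.2f} bits per command")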

Outlines

00:00

📚 Optimizing Emacs keybindings with information theory

This section explains the process of optimizing Emacs keybindings using concepts from information theory. Keystrokes sent to Emacs are treated as information transmitted over a communication channel; by calculating their entropy and constructing a Huffman encoding, more efficient keybindings can be found. It also touches on the origin of information theory, which Claude Shannon developed in the 1940s while working on sending telegraph messages more efficiently.

05:00

🔍 Computing the probability distribution and entropy of Emacs commands

This section explains how the usage frequency of Emacs commands is collected, converted into a probability distribution, and used to compute entropy. The author added a keylogger to his Emacs configuration and ran it for over two months, producing a log of the time, major mode, and keys pressed for each command entered. From this data, the most frequently used commands are identified and an information-theoretic encoding is applied.

10:00

💡 Comparing the current keybindings with the ideal encoding

The final section compares the author's current Emacs keybindings against the ideal encoding derived from information theory. Some frequently used commands turn out to have keybindings that are too long and therefore inefficient, while some rarely used commands occupy very short key sequences. Based on this analysis, keybinding changes are proposed so that Emacs can be used with less typing.

Keywords

💡Information theory

Information theory is the mathematical theory of quantifying information and making data communication and processing as efficient as possible. In this video it is used to optimize Emacs keybindings: keystrokes and Emacs commands are treated as information sent over a channel, their entropy is calculated, and Huffman coding is applied to find more efficient keybindings.

💡Entropy

Entropy is a core concept of information theory that quantifies the uncertainty, or unpredictability, of information. In this video, the entropy of the author's Emacs commands is calculated to determine how compactly those commands could in principle be encoded as keybindings. Higher entropy means more uncertainty; lower entropy means the information is more predictable.

💡Huffman coding

Huffman coding is an encoding technique that builds an optimal prefix code: symbols that occur frequently are given short codes and rare symbols are given longer ones, which minimizes the average code length. The video uses Huffman coding to derive more efficient Emacs keybindings from the measured command frequencies.

💡Keybinding

A keybinding is the combination of keys pressed to trigger a particular function in a piece of software or hardware. The goal of this video is to optimize Emacs keybindings so that the most common operations require the fewest keystrokes.

💡Emacs

Emacs is the extensible, customizable text editor developed as part of the GNU project. This video is about optimizing its keybindings.

💡Keystroke

A keystroke is a single press of a key on the keyboard. The video analyzes keystrokes from an information-theoretic point of view in order to construct more efficient keybindings.

💡Optimization

Optimization is the process of making a system or process perform as well as possible. Here, information theory is used to optimize Emacs keybindings so that commands can be entered with as few keystrokes as possible.

💡Claude Shannon

Claude Shannon was the founder of information theory and a pioneer of efficient electronic communication. The video explains that he developed the theory at Bell Labs while studying how to send telegraph messages more efficiently.

💡God Mode

god-mode is an Emacs package that provides a Vim-style command mode in which commands can be entered without modifier keys. The video assumes this mode when modeling the keyboard as 52 unmodified keys.

💡Analysis

Analysis is the process of breaking complex data down into its elements and patterns in order to understand it. In this video, the frequency of every Emacs command the author runs is logged and analyzed to find more efficient keybindings.

💡GitHub

GitHub is a widely used web-based hosting service for version-controlled software projects. The code for the keybinding analysis in this video is published on GitHub, so viewers can run the same analysis on their own Emacs configurations.

Highlights

The video discusses using information theory to optimize key bindings in Emacs.

Information theory is introduced as a way to view keystrokes in Emacs as information sent over a channel.

The entropy of the author's keystrokes ("the entropy of my hands") is calculated to measure how much information each command carries.

Huffman encoding is used to create more efficient key bindings.

Information theory was discovered in the 1940s by Claude Shannon at Bell Labs.

The telegraph system and its encoding methods are explained as an analogy for key bindings.

The video demonstrates the process of calculating the entropy of Emacs commands.

A keylogger is used to collect data on the frequency of command usage over two months.

The video shows how to convert frequencies into a probability distribution and calculate entropy.

The average information needed to send commands is found to be 6.5 bits per command.

The video discusses the limitations of using binary notation for key bindings.

The importance of maintaining semantic information in key bindings is emphasized.

The video presents a method to find the optimal length of key bindings based on their usage frequency.

A method to identify and reassign inefficient key bindings is proposed.

The video suggests potential applications of information theory in UI design and HCI.

The creator plans to develop a library with a user interface for easy application of the methods discussed.

The video encourages viewers to run the analysis on their own Emacs configurations.

The video concludes with a call to action for likes, subscriptions, and viewer engagement.

Transcripts

00:00

All right, so today we're going to be using concepts from information theory to optimize our key bindings in Emacs. If you don't know what that is yet, don't worry, I'm going to give an intro to information theory as part of this video. But the TL;DR is that we're going to view our keystrokes in Emacs as information sent over a channel, then we're going to calculate the entropy of my hands, create a Huffman encoding, and then we're going to use that encoding to see if I can come up with more efficient key bindings for myself.

00:27

Information theory was discovered in the 1940s when Claude Shannon was working at Bell Labs to come up with a more efficient way to send messages over the telegraph. You see, the telegraph was an ancient way of sending messages, way back before the telephone, and it worked like this: if you had a message you wanted to send, you would encode each of the letters in the message as a sequence of dots and dashes, and then you would send that sequence of dots and dashes over the wire. On the other end, someone would receive your sequence of dots and dashes and use it to decode the original message. Now, there are many different ways to encode letters as sequences of dots and dashes (you're probably familiar with things like ASCII encoding), but people figured out very quickly that if they used shorter sequences for the most common letters and longer sequences for the most unusual letters, on average they'd be able to send their messages faster.

01:08

So Claude Shannon sat down and he thought about this, and he wanted to know: what is the best encoding, and how much does it cost on average to send a message over the wire? To get an intuition for this, let's take a look at a simple encoding where every character uses a sequence that's the same length. If you have 10 bits, then you can encode 2 to the 10th different symbols, which you can see either by using binary notation or by using combinatorics, since there are two choices for each bit. Going the other way, if you want to encode 2 to the 10th different symbols, you'll need to use at least 10 bits: log base 2 of 2 to the 10th. So more generally, if you want to encode K different symbols, you need at least log base 2 of K bits to do so.

01:45

Now, this is only for when you use a fixed-length encoding for each symbol. So what happens if we use our little trick and include shorter encodings for more common symbols and longer encodings for less common symbols? I won't prove all of the math behind it (it may be a good topic for another video), but what Claude Shannon figured out is that the best way to do this is to use an encoding length that's proportionate to the probability that the signal occurs. More specifically, if the probability of seeing a particular symbol is p, then the encoding length for that symbol should be approximately log of 1 over p. Remember, p is a number between 0 and 1, so 1 over p is large for low-probability events and close to one for high-probability events. If you use this optimal encoding for each symbol, then the average number of bits you need to send over the wire is the sum of the probability each symbol occurs times the number of bits you need to encode that symbol.
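
A quick numeric check of the fixed-length case, as a Python sketch (the numbers are the ones used in the video; the code itself is not the author's):

    import math

    print(2 ** 10)                   # 10 bits can distinguish 1024 different symbols
    print(math.log2(2 ** 10))        # going the other way, 1024 symbols need 10.0 bits
    print(math.ceil(math.log2(26)))  # a fixed-length code for the 26 letters needs at least 5 bits per letter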

02:35

In our analogy, the Emacs commands are the symbols that we want to communicate to the editor, and our key bindings are the way we encode that information to quickly enter the commands on our keyboard. Some commands take only one key press, or in our analogy one bit; some commands take two key presses; and some commands take even more. What we want to do here is choose an encoding, that is, choose a set of key bindings for our editor, so that we can communicate the same commands in the smallest number of key presses.

03:04

So, as a brief overview, what we're going to be doing is adding a key logger to my Emacs, figuring out the probability that I enter each Emacs command, and then calculating a Huffman encoding so that the number of key presses I need is approximately the optimal value.

03:20

To start out, we're going to need to collect some data. I added a key logger to my Emacs config that records the current time, major mode, and keys pressed for every Emacs command that I enter. I couldn't find a package that logged everything that I wanted, so I actually wrote my own, but this implementation is loosely based on keylogger.el. I left this running for over 2 months and I wound up with a text file that was over 100 megabytes. Today we're going to look at the data, and we're going to sit down and figure out what happened.

03:46

I'll be using Clojure because it's the language I like the most, but the analysis that we're going to be doing isn't too complicated, so any language implementation will do here. Before anything else, I need to take this file and parse it from a CSV. The parser returns each element as a Clojure list, so I'm just going to quickly parse it into a map for easier access. Now, the first thing that I really want to know is how often I use each key command, so I'm just going to grab the frequencies of each command name and quickly look at some of the output here.
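
The author does this step in Clojure (the real code is in the linked GitHub repo). As a rough Python sketch of the same idea, assuming a hypothetical log file "keylog.csv" with one row per command of the form timestamp, major-mode, keys, command (the actual column layout is not shown in the video):

    import csv
    from collections import Counter

    # Assumed log format: timestamp, major-mode, keys, command.
    with open("keylog.csv", newline="") as f:
        rows = [row for row in csv.reader(f) if len(row) >= 4]

    # Count how often each command name appears in the log.
    freqs = Counter(row[3] for row in rows)

    # Most frequently used commands first, like the sorted output in the video.
    for command, count in freqs.most_common(10):
        print(count, command)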

04:14

But what I really want is the key commands that I use the most, so let's go ahead and sort by the number of times each command was seen. Now if I go ahead and run this, I can see the command that I use most often is the self-insert command, which makes sense because that's literally just typing characters into a buffer. Then there's this empty string that I don't really understand, then there's next-line and previous-line, which is basically just going up and down in a document (kind of funny that these are almost exactly the same number), and these two commands that I use to turn on and off god-mode, which is basically just an Emacs package that gives you a Vim-style command mode.

04:54

Something that's going to be kind of tricky for us here is that when a key sequence is bound to an anonymous function, our little logger just outputs a text representation of the function, which as you can see here is pretty hard to understand and follow, but if we have to guess what one of these is, we're just going to do our best. I'm actually really enjoying looking through these, just, you know, for entertainment, so I'm going to go ahead and show off one more page of results. You can let me know if you think your commands will look anything like mine do.

05:20

All right, now on to the main event: we're going to convert these frequencies into a probability distribution, and then we're going to calculate the entropy of my Emacs commands. Before we start, I'm just going to remove self-insert and the empty string from our analysis, because I think they'll make the data harder to analyze. I'm also going to remove the next three commands, two of which correspond to mouse movement and one of which is isearch-printing-char, which is sort of like a self-insert.

05:45

Next, I just want to count the number of key command presses left in our data set, and then in order to get the probability that I'll enter a particular command, I just divide the number of times I see that command by the total number of commands entered. Now I just need to get the probabilities for every command in my data set, and I can finally compute the entropy of the commands I enter into the Emacs editor.

06:07

What this is saying, and this is really interesting, is that in order to send all of the commands that I send to the Emacs editor, I need to communicate an average of 6.5 bits of information per command. On average, I need to type something that's 6.5 bits long. Now, we're making a few assumptions here, like that I enter my commands in a random order and that I'll be able to memorize and follow the optimal encoding, but in theory it should cost me 6.5 bits.
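
Continuing the sketch above (still Python rather than the author's Clojure), the filtered frequencies become a probability distribution and the entropy is the expected number of bits per command. The 6.5-bit figure is the author's result on his own two-month log; the toy counts below are made up and will not reproduce it.

    import math
    from collections import Counter

    def entropy(freqs):
        """Shannon entropy, in bits per command, of a frequency table."""
        total = sum(freqs.values())
        return sum((c / total) * math.log2(total / c) for c in freqs.values())

    # Toy counts standing in for the real (filtered) command frequencies.
    freqs = Counter({"next-line": 900, "previous-line": 880, "save-buffer": 300, "forward-word": 150})
    print(f"{entropy(freqs):.2f} bits per command")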

06:33

But what does that mean in real terms? Well, a standard keyboard has 26 letter keys and 26 other keys available for use. I'm going to assume that we're inside of god-mode's action mode, so all keys are available without modifiers, and for the keys that do use modifiers, I'm going to treat them as two separate characters. Now we have a really simple model of the world: we have 52 different keys, and they each represent one symbol that we can send over the wire. We can go in reverse to get the number of bits you can encode with 52 different symbols, which as we saw earlier is log base 2 of 52, or about 5.7. What this means is that if we have an optimal encoding, we can use an average of one key press, one single key press, per Emacs command.
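
That back-of-the-envelope conversion looks like this in Python (a sketch; 6.5 bits is the author's measured entropy and 52 keys is the video's keyboard model):

    import math

    bits_per_keypress = math.log2(52)   # one unmodified key press carries about 5.7 bits
    entropy_bits = 6.5                  # the author's measured entropy per command

    print(f"{bits_per_keypress:.2f} bits per key press")
    print(f"{entropy_bits / bits_per_keypress:.2f} key presses per command, on average")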

07:13

But we're not done cooking. You see, information theory tells us not just how good the optimal encoding is, but it also tells us how to get there. For this video, we're going to be using the Python library dahuffman for our implementation of Huffman encoding, so going back to the project, I'm just going to import this library and call it on our list of frequencies.
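
A minimal sketch of what that call can look like with the dahuffman library. The frequency table below is made up, and the entry points used (HuffmanCodec.from_frequencies and get_code_table) are the library's documented ones as I understand them, not a quote of the author's code.

    from dahuffman import HuffmanCodec

    # Made-up command frequencies standing in for the counts from the log.
    freqs = {"next-line": 900, "previous-line": 880, "save-buffer": 300,
             "forward-word": 150, "cider-jack-in": 30}

    codec = HuffmanCodec.from_frequencies(freqs)

    # get_code_table() maps each symbol to (number of bits, value of those bits).
    # The table may also contain dahuffman's internal end-of-stream symbol.
    for symbol, (nbits, value) in codec.get_code_table().items():
        print(symbol, format(value, f"0{nbits}b"))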

07:33

Unfortunately, the library returns the encoding in binary notation rather than keystrokes, so we need to do some extra work to convert the binary encodings into key commands. Basically, I did this by assigning each symbol to a binary value and then replacing the binary values in the code with the corresponding symbol, so for example 00000 followed by 00001 would be encoded as the key commands "ab". In this way, every sequence of five or fewer binary symbols is encoded as a single key press; since 52 is not an even power of two, there are also some symbols of length six. This approach loses a small amount of information, but it's guaranteed to be valid because it maintains the invariant that no encoding can be a prefix of another encoding.
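
The conversion code itself is not shown in the video, so the following Python sketch is one way to do it under the same constraints: build a prefix-free "key alphabet" of 12 five-bit and 40 six-bit patterns (12/32 + 40/64 = 1, so no pattern is a prefix of another), then greedily translate each Huffman bit string into key presses. The key names and the padding rule for trailing fragments are my own choices, not the author's.

    import string

    KEYS = list(string.ascii_lowercase + string.ascii_uppercase)  # stand-in for 52 unmodified keys

    def key_alphabet():
        """Prefix-free map from bit patterns to keys: 12 five-bit plus 40 six-bit codes."""
        codes = [format(i, "05b") for i in range(12)]        # 00000 .. 01011
        codes += [format(i, "06b") for i in range(24, 64)]   # 011000 .. 111111
        return dict(zip(codes, KEYS))

    def bits_to_keys(bits, alphabet):
        """Greedily turn a Huffman bit string into a sequence of key presses."""
        keys, i = [], 0
        while i < len(bits):
            five, six = bits[i:i + 5], bits[i:i + 6]
            if five in alphabet:
                keys.append(alphabet[five])
                i += 5
            elif six in alphabet:
                keys.append(alphabet[six])
                i += 6
            else:
                # Trailing fragment: pad with 1s (this is where a little information is lost).
                padded = bits[i:].ljust(6, "1")
                keys.append(alphabet.get(padded[:5]) or alphabet[padded[:6]])
                break
        return "".join(keys)

    alphabet = key_alphabet()
    print(bits_to_keys("0000000001", alphabet))  # two five-bit chunks -> "ab"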

08:16

All right, so now let's take a moment to look through these. I can see that common key commands like end-of-line, newline, and save-buffer are represented as a single key press, and then I can also see that more rarely used commands are given more complicated sequences of keys. But as you can already see, there's a really obvious problem here: these key bindings don't make any sense. Do you think I'm going to memorize the sequence 2J DQ for a command that I use one time?

08:43

The standard Emacs key bindings are chosen to be short, but they're also chosen to be mnemonic: Control-B stands for backward character, Control-F stands for forward character, Control-S stands for search, and the list goes on. Encoding commands as sequences of bits, we produced an efficient encoding, but it doesn't provide any value, because it loses all of the semantic information of what the key command actually does.

09:09

So now it's time to take our lessons from information theory and apply them so that we can decrease the amount that we have to type, but not increase the amount that we have to memorize. You see, information theory provides us with one more key piece of information: the length of a particular symbol should be approximately log of one over the probability of that symbol occurring. This lets us find, in a mathematically precise way, how far our current key bindings are from the optimal ones.

09:36

To compute this, we also need the lengths of the current key bindings that we actually use. I did log this inside of my logger, but it turns out that the internals of god-mode mangled the way that my key sequences were logged, so I had to get the list of my currently active key bindings by running describe-bindings inside of clojure-mode and then parsing the output. Then, for each command, I computed the optimal length of the key binding for that command, so under the ideal encoding, how many key presses it should take to enter, and I compared it to the actual length of the command, the number of key presses that it takes for me to enter that command under my current key bindings. Now we can go ahead and compute the difference and figure out how far the length of our current key binding is from the ideal value.
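
A sketch of that comparison in Python (the author's version is in Clojure and parses describe-bindings output; the probabilities and current binding lengths below are made up for illustration):

    import math

    # For each command: probability of entering it (from the log) and the number of
    # key presses its current binding takes. All numbers are made up for illustration.
    commands = {
        "forward-word":  {"prob": 0.03,   "current": 4},
        "cider-jack-in": {"prob": 0.01,   "current": 6},
        "suspend-frame": {"prob": 0.0001, "current": 1},
        "save-buffer":   {"prob": 0.05,   "current": 2},
    }

    BITS_PER_KEY = math.log2(52)  # one unmodified key press carries about 5.7 bits

    for info in commands.values():
        ideal = math.log2(1 / info["prob"]) / BITS_PER_KEY  # ideal length in key presses
        info["diff"] = info["current"] - ideal

    too_long = sorted(commands, key=lambda c: commands[c]["diff"], reverse=True)
    too_short = sorted(commands, key=lambda c: commands[c]["diff"])
    print("rebind to something shorter:", too_long[:2])
    print("candidates to unbind:", too_short[:1])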

10:23

So now I'm just going to run this, and I'm going to sort by the commands with the largest difference. As you can see, this shows us the commands we have where the key bindings are too long: these are the commands that we use constantly but take a tremendous amount of effort to enter. I'm looking through this list and it's absolutely wild: forward-word, backward-word, cider-jack-in. Every time I see one of these commands I'm like, God damn it, obviously I should fix this. Intuitively, I know what this list is; these are the same commands that annoy me every single day.

10:56

Conversely, we can also sort by the commands that have the most negative difference. These are the commands that have a very short key sequence but that we don't use very often. For example, by default Control-Z is bound to suspend-frame, which is basically the same as clicking the yellow minus to minimize the window, and I pretty much never do this except by accident, and then I'm, you know, annoyed that the window is minimized. What this list shows us is which commands we can unbind from their current key bindings.

11:25

Taken together, these two lists tell us exactly what we need to change and how to change it. Specifically, we need to unbind the key bindings in the bad list and then reassign them to key bindings in the good list. So for example, I can unbind Control-Z from suspend-frame and then rebind it to save-buffer, a command that I use much more frequently. We can keep most of our current key bindings but rebind all of the worst offenders, so that we can happily use Emacs with the smallest amount of typing.

11:57

All of the code for this video is freely available on GitHub, link in the description, so you're welcome to try running this analysis on your own config and see if you get any useful results. If you do run this for yourself, I'm super curious what happened, so if you have any results you want to share, definitely feel free to reach out or respond in the comments down below.

12:15

Now, there's a lot of other things that I wanted to try as part of this video but didn't get to, because I thought it would make the format too long. I think the ideas in this video have really interesting implications for general-purpose UI design and HCI, and I would love to design a website that uses the principles of information theory, along with a lot of logging and analytics, to determine how menu items are laid out on the main page and how nested levels or multi-step menus are organized. I also made a lot of approximations during the course of this video, one of the biggest being that I did not consider the optimal way to assign key bindings to a multi-step sequence of commands. This is super important in taking discrete symbols to their information-theoretic limits, and also, intuitively, I think it would surface things that should be Emacs commands: key sequences that I use frequently that are doing something that could be represented by a higher-level concept.

13:03

I'm also planning to repackage a lot of the code in this video into an easy-to-use library with a nice user interface, so if you don't know anything about information theory, you can just run it and see what key bindings you need to change, and I'll add all the code for that to the GitHub link that's already in the description.

13:19

Anyway, if you made it this far, give the video a like, subscribe to the channel, and I'll see you in the next one. I also want to thank you guys for helping me reach 1,000 subscribers. I know it's kind of dumb, but it's truly the motivation I need to keep this channel going. Anyway, thanks so much for watching, and I'll see you in the next one.


Related Tags
Information theory · Emacs · Keybindings · Optimization · Huffman coding · Programming efficiency · Encoding · Data analysis · UI design