The hotels I've stayed at with this functionality all had an Aruba wireless system with a hospitality-model AP in a 1-gang wall box near the floor or strapped to the back of the TV.
Aruba (like many enterprise wireless vendors) has an mDNS proxying and filtering system, which is essential for places like college campuses. AirPlay and Miracast (infrastructure mode) both use mDNS for service discovery. In Aruba's system, called AirGroup, the wireless controller queries the RADIUS server when it sees a new MAC address. The RADIUS server can return a configuration for that MAC address telling the controller which other devices may interact with the new device over mDNS.
I always assumed this vendor adds and removes AirGroup configuration to let the phone and the TV see each other's mDNS traffic while hiding it from other nearby devices on the same VLAN or SSID. That trick might be enough, but also pushing a dACL to the wireless controller to properly packet-filter all network traffic from unapproved devices would strengthen the solution.
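For the curious, the discovery traffic AirGroup filters is ordinary DNS carried over multicast: an AirPlay receiver, for example, is found with a PTR query for `_airplay._tcp.local`. A minimal sketch of building such a query packet (stdlib only; the service name is just an illustration, and real mDNS resolvers set additional mDNS-specific bits):

```python
import struct

def mdns_ptr_query(service: str) -> bytes:
    """Build a one-question DNS PTR query, as used for mDNS service
    discovery (sent to 224.0.0.251:5353 in real deployments)."""
    # Header: ID=0, flags=0, QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack("!6H", 0, 0, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        struct.pack("!B", len(label)) + label.encode()
        for label in service.split(".")
    ) + b"\x00"
    # QTYPE=PTR (12), QCLASS=IN (1)
    return header + qname + struct.pack("!2H", 12, 1)

pkt = mdns_ptr_query("_airplay._tcp.local")
```

The proxy's whole job is answering (or refusing to answer) queries like this one on behalf of devices in other broadcast domains.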
The Bugzilla tickets linked from that article frustrate me. They should autoplay Yakety Sax as they dodge around fixing the real @#$@ing bug:
Just copy Chrome and confine all modal dialog boxes, such as HTTP basic auth and JavaScript alert(), to the individual browser tab. No tab should ever be allowed to pop a modal that prevents interaction with any other tab or any other browser window.
This problem immediately goes away, and you don't need to play rate-limit whack-a-mole or do stupid things like show a dialog box that asks if you want to see another modal dialog box.
As someone who interacts with HTTP basic auth frequently Firefox's behavior here is maddening. Fix the bad UI.
> Just copy Chrome and confine all modal dialog boxes, such as HTTP basic auth and JavaScript alert(), to the individual browser tab. No tab should ever be allowed to pop a modal that prevents interaction with any other tab or any other browser window.
So, `alert()` was fixed about 10 years ago in Firefox.
I'm not sure why the "Authentication Required" dialog wasn't, but I'm willing to bet it's something that was blocked indirectly by the old extension infrastructure (the so-called XUL extensions): until Firefox ~57, huge chunks of the architecture of Firefox were impossible to touch without breaking XUL extensions at a fundamental level, and this included making many things non-blocking.
If I'm right, it's the kind of thing that can now be fixed.
The answer is probably something like "that dialog came from necko, because we didn't put in a good way to propagate blocking prompts back up from the network layer in a way that identifies which tab wants the request".
If extensions were the problem, an interface to actually let the extensions work would have been created. As it is, you still can't implement a password manager natively.
Wow, yes, having looked at it, it really is that simple. All the exploit does is trigger a 401 authentication popup. There's even a comment on that bug with the exact scam in it - from two years ago!
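For anyone wondering why this is so easy to abuse: the dialog is triggered purely by a 401 response carrying a `WWW-Authenticate` header, and the realm string shown in the prompt is chosen by the site. A tiny sketch of the raw response (the realm text here is made up):

```python
def basic_auth_challenge(realm: str) -> bytes:
    # A bare HTTP/1.1 401 plus a WWW-Authenticate header is all it
    # takes to make the browser pop its blocking auth dialog; the
    # realm text the user sees is fully site-controlled.
    return (
        "HTTP/1.1 401 Unauthorized\r\n"
        f'WWW-Authenticate: Basic realm="{realm}"\r\n'
        "Content-Length: 0\r\n"
        "\r\n"
    ).encode()
```

Any server that can emit those few header lines can put arbitrary text in front of the user in a window-blocking prompt.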
In-browser treatment of HTTP auth is just shockingly bad. But Firefox seems to be somewhere you get rewarded for introducing new features rather than fixing bugs.
Funnily enough, in old versions of Firefox (before they deprecated the old plugin system), password managers like LastPass were able to alter the HTTP-auth pop-up to add their functionality to it.
At the time I thought that was cool, and was sad when it went away with the new plugin architecture, but looking back it does indicate quite how bad the situation was with that old plugin format.
This behavior causes another problem as well. I am unable to interact with my password manager plugin when a basic auth dialog is active. I need to remember to manually look up the password and have it in the clipboard before navigating to a site that uses basic auth or else open an additional Firefox window.
Honestly, I'm not sure why browsers don't create a pseudo-page for things like HTTP auth, JavaScript prompts, etc. Instead of an ugly blocking modal, generate a basic login form/page with nicer styling that doesn't take over the entire window or tab, like they do for a new tab and whatnot.
This reminds me of one of my favorite historical audio clips. A few minutes before the 1965 Blackout in NYC the power frequency dipped as low as 51Hz (US is 60Hz). Playback equipment with motors setting speed based on the power frequency sllllooowwwweeeeddd down.
Tons of places outside the Fortune 500 are doing that. At work we just disabled IMAP for almost all O365 accounts, along with any other authentication method that doesn't go through our two-factor authentication. The rising number of accounts compromised because the user reused a password elsewhere, apparent brute-force attacks, et al. forced the issue.
Of course, MDM is not required for any of this. Worst case, on Android you're forced to use Microsoft's Outlook app.
I set up WireGuard and was pleased with the setup. Unfortunately, I tried to use it over the next week only to quickly realize I should have kept my OpenVPN install instead.
Outbound port filtering is incredibly common on public and guest Wi-Fi networks, and I found three use cases in my first week where OpenVPN on 443/tcp would have worked fine. The inability to run WireGuard over TCP and bypass most outbound port filtering by using 443/tcp et al. makes it unusable in my daily life. I can understand why TCP isn't performant, but my choice isn't performant vs. non-performant; it's 'works somewhat' vs. GFY.
And yes, I've seen the UDP-over-TCP forwarding hacks. They don't work on iOS, and some look outright dangerous (hello, open proxy).
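For the curious, the non-dangerous core of those hacks is trivial: length-prefix each datagram so its boundaries survive TCP's byte stream. Everything risky lives in how the far end re-emits the datagrams. A toy sketch of just the framing (hypothetical helper names, not any particular tool's protocol):

```python
import struct

def frame(datagram: bytes) -> bytes:
    # Prefix each UDP datagram with a 2-byte big-endian length so
    # datagram boundaries survive the TCP byte stream.
    return struct.pack("!H", len(datagram)) + datagram

def deframe(stream: bytes) -> list[bytes]:
    # Recover complete datagrams from a (possibly partial) stream.
    out, i = [], 0
    while i + 2 <= len(stream):
        (n,) = struct.unpack_from("!H", stream, i)
        if i + 2 + n > len(stream):
            break  # incomplete datagram; wait for more bytes
        out.append(stream[i + 2 : i + 2 + n])
        i += 2 + n
    return out
```

The open-proxy danger comes when the TCP endpoint forwards recovered datagrams to arbitrary destinations without authenticating who sent them.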
Hopefully this can be addressed before WireGuard hits 1.0.
Have you tried setting up WireGuard on a different port?
I use port 4500, which is typically used for IPsec NAT traversal, and have found it available and working on most networks where the default WireGuard port was filtered.
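In case it helps anyone replicating this, the port change is only a couple of config lines (interface file name and hostname below are placeholders; keys omitted):

```ini
# Server side: /etc/wireguard/wg0.conf
[Interface]
ListenPort = 4500   ; instead of the default 51820

# Client side: point the peer at the same port
[Peer]
Endpoint = vpn.example.com:4500
```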
I'll give 4500 a whirl to see if it increases the success rate; it's a good idea. However, it's not inconceivable for sites like cafes to allow only 80/tcp and 443/tcp, because that was an option in the guest-network UI on their Wi-Fi router.
At this point, if I were designing a VPN for client devices, I'd have a mode that looks as close to HTTPS as possible. There is one tool to tunnel over WebSocket, but this was already sucking up too much of my play time. :)
Cisco AnyConnect, while expensive and bloated, works great: it initially connects on 443/tcp and then tries to set up UDP. If UDP fails, it just sticks with the TCP connection and "just works".
Well, I doubt they'll reject clients based on the use they intend to give to their product, but I can't really fault them for that. It seems like a legitimate business. Hardly unethical.
Perhaps we could imagine better privacy laws. The fact that something is legal, and a legitimate business, doesn't mean that it's good or should be allowed.
Most (all?) of their examples in the FAQ are related to corporate drivers, not individuals. Something like Uber would be a bit of a gray area since you use your own vehicle, but all the others are more like DHL and FedEx, where you're driving a company-provided vehicle and there should be no expectation of privacy in regards to how you operate that vehicle.
Those are the examples given, but that wouldn't explain the huge numbers claimed elsewhere - nearly a quarter of phones in the US. Pardon me for not assuming that a data gathering corporation has my best interests at heart.
"The company doesn't work directly with services such as Uber and Lyft, but a number of apps, such as Sherpashare (which is primarily used by ride-hailing drivers for services like Uber and Lyft), HopSkipDrive, eDriving and a variety of navigation apps use Zendrive's technology to monitor ride safety."
Hurray for malware. What other apps is this hiding in?
The touch bar examples shown are a usability disaster. You're going to hide UI from the screen and make me keep looking at the keyboard to find functionality?
I stopped looking at the keyboard every 10 seconds when I learned how to touch type.
The presenter spent most of his time looking at the keyboard and not the screen.
This gimmick will disappear when Apple decides a touch screen is needed to complete the slow merge with iOS.
>The touch bar examples shown are a usability disaster. You're going to hide UI from the screen and make me keep looking at the keyboard to find functionality?
That's the wrong way to think of it.
It's not a keyboard, it's an adaptive toolbar. And it's close to what professionals in several industries pay handsomely for -- control surfaces, only this one is also adaptive.
>I stopped looking at the keyboard every 10 seconds when I learned how to touch type.
It's obviously NOT meant for typing-heavy workloads. Secretaries, and programmers deep in code, will not use it when doing their thing.
These professionals in several industries paid handsomely for the Adaptive keys on the 2nd generation Lenovo Thinkpad X1 Carbon. So much so that Lenovo had to remove adaptive keys for the next iterations.
>These professionals in several industries paid handsomely for the Adaptive keys on the 2nd generation Lenovo Thinkpad X1 Carbon. So much so that Lenovo had to remove adaptive keys for the next iterations.
Things are not the same just because they are instances of the same concept.
Lenovo's implementation was black and white, didn't have controls such as slide, sweep, or multitouch gestures, and was just a novelty from a single Wintel vendor without much third-party app or OS support.
Apple's implementation adds deep OS support and will have software vendors on board (judging from the adoption of older features Apple puts into its hardware/OS, it won't be more than a year before most apps people use support this, from Photoshop, which already does, to the Apple Pro/iLife apps of course, to Pixelmator, Premiere, and tons of others). It also has color, multitouch gestures, etc., which increase the utility much more, along with higher definition (for thumbnails and such, which it also does).
Also, apart from that, Lenovo didn't sell much to the creatives market, where such a feature would be appreciated more (if done properly, of course). MacBook Pros already sell very well to creatives.
It's the same old story, really: a more thought-out, more integrated, and better-supported implementation beats "first to market with the general idea" every time.
Eh. Back in the days when the animals talked, there were keyboards with generic function keys in a top row, with enough space around them for a plastic template sheet labeling the keys with functions for specific software.
Some templates came pre-printed for specific packages; some were blanks for pencil-your-own. Rarely used, but at least those thingies were cheap and did not remove real keys.
Such things were even marketing features; lots of old calculators had them. HP calculators even had dedicated modules paired with custom stickers (or overlays) to adapt, much like the touchbar.
Everybody wants 'this', yet nothing has nailed it. Keyboards are strange beasts, and designers rarely have the right context or experience. A virtual bar is awesome in theory, but the fixed and tactile qualities that seem idiotic today are actually part of the value. Imagine a piano without keys. Even non weighted keys are despised by any musician. Your mind and body have lots of subtle cue sensors to perceive and react accordingly. Anything that contradicts this will fail for refined/industry use.
>Even non weighted keys are despised by any musician.
Not exactly. Jazz and classically trained musicians, yes. Rock, pop, electronica, etc gigging keyboardists play non-weighted keyboards all the time.
>Even non weighted keys are despised by any musician. Your mind and body have lots of subtle cue sensors to perceive and react accordingly. Anything that contradicts this will fail for refined/industry use.
We have already been over this when the smartphones came out without actual keys.
To add to this, I didn't interpret it as "hiding UI" either. I don't think anything will be exclusively available for use on the TouchBar. The way I understood it was the TouchBar will behave like shortcuts to existing functionality.
+1 To me it was more about saving mouse movements than hiding UI. Don't mouse all the way to the top of the screen to click that. Press this dynamic, context aware, button on your toolbar instead. If you'd still rather use hot keys, go for it!
The thing about the Surface Dial is that you don't have to look at it. If you want to press a button on the TouchBar, you have to look at the keyboard.
Doesn't seem like an improvement. F9, F10 and F11 have done this quite well for the last 20 years for me. I can use those blindly while looking at the code and using the mouse to hover variables that I want to inspect...
Every single shortcut shown in the cringe-worthy demos today is already mapped to a well-known keyboard shortcut in the app. And pro users can use those shortcuts without ever having to look down at the keyboard.
Final Cut demo claimed that "some of these shortcuts are hard to find in the menus of an advanced app" whereas:
1. the touchbar was displaying the simplest of shortcuts, many of which are already mapped to keyboard shortcuts that professionals use
2. menus in MacOS have a built in search which touchbar lacks, obviously
3. the very next demo, of Photoshop, showed touchbar shortcuts nested two layers deep that are basically undiscoverable.
What the hell is touchbar if not a totally useless gimmick for actual pro users of pro apps?
>Every single shortcut shown in the cringe-worthy demos today is already mapped to a well-known keyboard shortcut in the app. And pro users can use those shortcuts without ever having to look down at the keyboard.
I'm a pro user (of NLEs) and I don't "use those shortcuts without having to look down at the keyboard". Only a few of them.
And it's not because I don't do shortcuts in general (I've used Vim for over 20 years in all its glory).
Professionals use external interfaces all the time, e.g. for color correction, editing, etc. While this doesn't fully replace those, it's more than adequate for a lot of what they do, and perfect for editing in the field.
It's also not about those "keyboard shortcuts". Flipping through movie frames at variable speed is not a shortcut. Applying filter resonance in a DAW is not done with a shortcut. Heck, this can even change several items at the same time (e.g. two virtual sliders). There are literally tons of other things we now use sliders, dials, etc. for in pro programs, for which arbitrary speed and "jump to place" (not just the "one click at a time" that shortcuts offer) will be great.
>What the hell is touchbar if not a totally useless gimmick for actual pro users of pro apps?
It also has no wifi and less space than a Nomad from what I heard. Lame.
Anyway, let's give it a year and we'll see how many pros swear by it.
> Professionals use external interfaces all the time
Hence the need for the touchbar is further greatly reduced
> It's also not about those "keyboard shortcuts"... There are literally tons of other things we now use sliders, dials, etc for in Pro programs, and which arbitrary speed and "jump to place"
Which are delegated to a tiny strip in an awkward location (notice how carefully everyone holds their fingers at ~90 degrees to the keyboard), controlled by very imprecise finger movements.
> Anyway, let's give it a year and we'll see how many pros swear by it.
I can agree on that :) We need more actual field experience
> Which are delegated to a tiny strip in an awkward location
Yeah, it seems like it might have been more useful to put it on the side of the keyboard. And they wouldn't have even needed to get rid of escape and the F-keys then.
Well, obviously it's not for coding, but I understand the person you're replying to perfectly. And I think there are hordes of developers out there coding daily on their MBPs, who are now sort of left out in the cold. An anecdote, though I doubt I'll be the only one for whom the lack of F-keys is a showstopper (also see other HN comments): after about 5 years on a Dell I was thinking of maybe going back to a MBP this year (even though my last one lasted only 3 years), but seeing as I spend a lot of time debugging, which translates to repeatedly hitting F-keys burnt into muscle memory, it's just not going to happen. Even if I learned to do it with that bar thingie, it's a bit of a waste of time, since on any other machine the bar isn't there.
Coders don't just write code (no, you don't need F-keys for that, and yes, your hands can stay on the home row); they also debug it (for which the environments I, and seemingly others, use assigned the F-keys many, many years ago).
I always forget which is which. Looking at the Chrome debugger now: Pause/Resume is F8, Step Over is F10, Step Into is F11, Step Out is Shift+F11. Hence I use the mouse.
As an occasional debugging user, it would be a great addition to have this on the keyboard.
I wonder if it would have been better to just allow different images to be displayed on the existing hardware keys. That way, you have physical keys, and can use them without looking, but when you're in an app that you're unfamiliar with, you can look and see what the keys are for.
I always use those keys on a physical keyboard while looking at the display, but with a touchscreen, I always have to look where I'm pressing keys, and even then my fingers are sometimes slightly wrong.
I have something like a touch bar on my Windows laptop. The main problem I have with it is that I touch it without knowing I have. Many a time I've gone down a rabbit hole of Wi-Fi network issues only to discover that I'd accidentally toggled Wi-Fi off by brushing the bar.
This is the one feature on my Windows laptop I don't like, and I'm not excited for a new Mac with one.
If they use haptic feedback it might not be so bad. Plus, Apple controls the OS and hardware together, so they can ensure a notification is shown for behavior like turning off Wi-Fi or something similar.
When Schiller showed how MS Office worked with the touch bar, I thought it was pretty hilarious.
All of the toolbar buttons and more were already on the main screen. If they had offered a touch screen instead, the user wouldn't have to take his/her eyes off the screen to perform the same functions. And that feature is already available on Windows machines with touch screens.
>If they had offered a touch screen instead, the user wouldn't have to take his/her eyes off the screen to perform the same functions.
If they had "offered a touch screen instead", the users would have to hold their hand horizontally which would get tiring in about 5 minutes and unbearable after 10.
Not to mention that the on-screen tools were and will still be there anyway if one wants to use them with the traditional, and not challenging to the muscles, mouse and touchpad technology...
>> If they had "offered a touch screen instead", the users would have to hold their hand horizontally which would get tiring in about 5 minutes and unbearable after 10.
Why do so many Mac users think the presence of a touch screen all of a sudden means using it full time?
I got a Surface Pro when I was still a predominantly Mac user and I only used the touch screen when appropriate, which was for a fraction of a second every now and then. And it worked great.
Using a touchscreen is definitely better than a touchpad in a lot of scenarios because of the ability to directly manipulate content instead of having to navigate to it.
>Why do so many Mac users think the presence of a touch screen all of a sudden means using it full time?
Who said full time? Full time it would be unbearable in 1 minute. I said unbearable in 10 minutes with casual use in mind: raising your hand now and then to click this or that button.
>Just like the bigger screen iPhone, which was supposedly a usability disaster because you couldn't reach the whole screen with a single finger.
There's this idea that people will buy everything Apple does, or that people put down things until Apple does them.
First, Apple started from near bankruptcy in 1997 or so. People clearly didn't just buy whatever Apple put out. What built Apple's fortune was not some hundreds of millions of "sheep" who magically bought everything Apple made, but a steady stream of products that increasingly more people bought.
People didn't just buy everything Apple put out. People had to be convinced, from the first products after Jobs' return (Cube, iMac), to start buying them -- increasingly over time. The iPod took years to really take off, for example.
Along with this idea, there's this other idea that Apple 'fans' will put down a feature until Apple releases something that has it.
Those saying that seem to forget that the internet is full of people, and the people who said "the iPhone doesn't need a bigger screen" are not necessarily the ones who bought one after Apple released it.
Apple still sells tens of millions of non-plus iPhones, to people who don't really like the large screen, and even put out the old 4" model again (iPhone SE).
I got a Plus, and it's still a usability disaster in that regard. One-handed use (which is mostly what I did on the smaller models) is pretty much out. But I cope, mostly because I appreciate the extra ~50mm-equivalent camera lens, and reading stuff while commuting etc. (with two hands).
>> There's this idea that people will buy everything Apple does, or that people put down things until Apple does them.
Part of the problem is that Apple / Steve Jobs sometimes helped perpetuate this idea.
Some examples:
* Steve Jobs saying video on an iPod type of device was pointless
* Steve Jobs saying that apps were pointless on the iPhone and that the web apps were good enough
* Apple putting out a TV ad emphasizing that the iPhone had the perfect screen size because your finger could go from end to end
Apple dislikers predictably misinterpret this as Apple taking an ideological position, when all Apple is really doing is trying to control the marketing message around what they think really matters in the current release of a particular product.
Maybe it's because I've gone for the biggest phones as daily drivers for a while now (Note series => Nexus 6 => Plus), but 1 hand is a total non-issue for me.
I just shift the phone around in my palm. Even with an oversized OtterBox-style case that I put it in from time to time, I'm able to reach 80% of the screen with my thumb, and I reach around with my other four fingers to get the rest of it.
Pretty similar to how I use/used my Nexus 6 one handed, and it was a little bigger than the Plus
This isn't how you use a touch screen on a laptop. I don't know why you're so insistent on this narrative but it's false. Adding a touch screen doesn't turn it into a tablet, it turns it into a laptop with a touch screen. That's a very important distinction.
You would do well to try one out for a few minutes with a typical workflow of...well, whatever you do on a computer. I thought it was stupid at first and now I miss it when I use my MacBook.
After getting a Surface Pro, I would constantly start swiping up and down on my Macbook Pro's screen thinking I could scroll. I still do that on other non-touch laptop screens too.
>Adding a touch screen doesn't turn it into a tablet, it turns it into a laptop with a touch screen. That's a very important distinction.
That's my point exactly. A laptop with a touch screen is not convenient unless it can somehow be turned into a tablet (the Surface does this, IIRC).
>You would do well to try one out for a few minutes with a typical workflow of...well, whatever you do on a computer.
Tried a few times to get a sense of how it would be during this thread. It doesn't work for me at all (at least when sitting at a desk with the laptop). I don't want to raise my hands and hold them to the screen, and I'm not that hot on touching the screen with my fingers either.
>> Tried a few times to get a sense of how it would be during this thread.
On a real touchscreen laptop or simulating on a non-touchscreen laptop?
I think it makes a real difference to sit down with an open mind and go through some real use cases on a laptop with a working touchscreen (and the software you'd actually use). You'll realize you only use it when appropriate, for a few seconds at a time, if that. And there will probably be huge time gaps (hours, even days) where you might not use it. But at the odd time when you need it, you're often glad it's there.
I don't get how it can be unbearable in 10 minutes on a laptop. It's not going to be much further away from you than a tablet would be, and people love them their iPads. If the experience was -that- bad, nobody would be buying iPad Pros.
And it's not as though you can't continue to use the keyboard, touchpad and/or mouse.
I'll admit, when the Surface Pro first came out, I mocked the touchscreen too. I thought it would create Gorilla Arm. Then I actually got one and used it for an extended period of time. You don't even notice that you're touching the screen.
>I don't get how it can be unbearable in 10 minutes on a laptop. It's not going to be much further away from you than a tablet would be, and people love them their iPads. If the experience was -that- bad, nobody would be buying iPad Pros.
It's about the orientation, not the touch screen itself. A tablet you normally hold horizontally or at an angle when you use it.
Now, if it were a detachable screen, like the Surface, that could work, but a laptop screen sits vertical to the keyboard.
>> It's about the orientation, not the touch screen itself. A tablet you normally hold horizontally or at an angle when you use it.
I get that it might not be for everyone, but a LOT of keyboard cases have been sold to iPad and iPad Pro users (and many of these cases prop up the iPad screen vertically). And keep in mind that iOS pretty much necessitates use of the touch screen more than Windows 8/10 does. So there must be a LOT of people who would be OK with that mode of use.
With respect to the Surface, I use it with the keyboard attached 99% of the time so it's pretty much vertical all the time. It's not even remotely uncomfortable or tiring in normal use.
>I get that it might not be for everyone, but a LOT of keyboard cases have been sold to iPad and iPad Pro users (and many of these cases prop up the iPad screen vertically).
Yes, but those are for writing -- i.e. using the iPad laptop-style -- not for doing work, e.g. graphics, with the iPad held vertically.
>You aren't really suggesting that writing is not work, are you?
No, I wrote "e.g. graphics" as a parenthetical expression to give an example of the kind of work they don't use those cases for. That is, what I wrote amounts to:
"Yes, but those [cases] are for writing -- i.e. using the iPad laptop-style. [They are not using the cases] for doing work [like graphics] with the iPad held vertically."
>And why would Schiller make a point of mentioning the touch bar integration with Office and iWork?
Because Schiller made this point about a laptop, and even more so about a laptop with a flat horizontal strip.
Whereas what I said is that touch interaction is tedious for people on a vertical screen. In Schiller's example there's Office and iWork but no vertical touch screen -- just the strip, and the regular screen you handle with the mouse/trackpad (that is, without having your hands in the air to touch the screen).
Different strokes for different folks, I guess, but in my use of touch on a vertically oriented screen, it hasn't been negative at all.
There's something to be said for being able to use two fingers to directly manipulate a window's contents -- zooming an image or vector diagram to precisely the size you want -- versus futzing around with control-plus or control-minus using the app's preset increments.
Yeah, I feel as if people try to hate touch screens because Apple doesn't think they're right, even though iPad users with keyboards use that interface. It requires a transition period, but it becomes second nature after a while.
One of the important things for me with something like the Surface is that when you finally dock it into a full setup, you don't lose any capability. What happens when you do this with the new MBP? Is a new Apple keyboard in the works?
I was considering getting the new MBP this time around, thinking this was going to be an enhanced media-button bar. But now I'm definitely back on the sidelines.
> If they had "offered a touch screen instead", the users would have to hold their hand horizontally which would get tiring in about 5 minutes and unbearable after 10.
This is complete nonsense. I used to think this when I used a MacBook full time. I still use one, but buying a touchscreen PC has worked wonderfully, and I never hover my hand long enough to do anything tiring. It's great for doing quick touches or scrolling / zooming in with precision. You wouldn't want to use it for everything, because that's not its use case (this isn't a dedicated tablet; this is a computer with a touch screen).
>This is complete nonsense. I used to think this when I used a MacBook full time. I still use one, but buying a touchscreen PC has worked wonderfully, and I never hover my hand long enough to do anything tiring.
I never hover my hand, period, so that's one better. Not that hot on touching the screen either...
>It's great for doing quick touches or scrolling / zooming in with precision.
With my hands on the keyboard and my thumbs close to the touchpad, I don't see the appeal of suddenly raising my hands to the screen, especially for general things like zooming, scrolling. It could make sense to manipulate something like an object directly, but to raise hands just for zooming or scrolling?
>> With my hands on the keyboard and my thumbs close to the touchpad, I don't see the appeal of suddenly raising my hands to the screen, especially for general things like zooming, scrolling. It could make sense to manipulate something like an object directly, but to raise hands just for zooming or scrolling?
I think I figured it out. You're spoiled by the quality of the Apple touchpads. A few minutes with a crappy Windows touchpad and you'll be wishing you could touch the screen.
FWIW, I hate all touchpads. I just hate Apple touchpads the least.
>I think I figured it out. You're spoiled by the quality of the Apple touchpads. A few minutes with a crappy Windows touchpad and you'll be wishing you could touch the screen.
Hmm, you might be on to something here. I'm so used to Mac trackpads that I almost never use a mouse with them (except for heavy Illustrator/Photoshop work), whereas with my Windows laptops I always use a mouse.
>FWIW, I hate all touchpads. I just hate Apple touchpads the least.
I hate IBM's/Lenovo's red-nipple thing with the same passion!
>If they had "offered a touch screen instead", the users would have to hold their hand horizontally which would get tiring in about 5 minutes and unbearable after 10.
Lifting my arm up to the screen every time I want to use something from the toolbar? No thank you. I already hate lifting my hands from the keyboard to use a trackpad.
Except all of those that use an F-key are broken now. (You might interject that macOS itself does not use F-keys in shortcuts, but Linux and Windows apps in VMs do.)
This is not for when you're in "get stuff done quickly" mode. They might have predictive text, but I don't think anyone will use it. For me it's when I'm editing images in Lightroom: I wish I could do more stuff full screen so that I don't have the visual clutter in the way.
This. Lightroom currently doesn't make it possible to view a photo fullscreen while at the same time having editing controls. A touch screen is also not ideal, because then your hand covers the image. The touch bar seems to make using adjustment controls in full-screen really easy.
All the keys will be virtual, and they will have haptic feedback. The shallow key travel now is to prepare people, similar to the coke <> new coke conspiracy.
Then they will make the keyboard half of the laptop an ipad. Then they will release two ipads that hinge together. Then they will figure out that ios/macos dichotomy is causing problems and finally go through with some hybridization.
Wouldn't count on that. I'm pretty sure that to use the touch bar you have to look down and aim while you press the thing, which takes some time. Compared to that, you can most likely already grab your mouse or find the touchpad blindly and move the cursor in a fraction of a second.
I think it has the potential to be an interesting alternative to a touch screen. Some of the examples did look modestly useful. I will, however, miss the volume and music forward/back keys which I actively use. Like anything else, I'll have to wait and see how it works in practice.
I've been holding off on updating my macbook pro, but I think I'm going to skip this generation and buy used. The lack of an nvidia gpu is a giant pita. You never get good cuda performance on a laptop, but it's nice to be able to test code on one.
If they expand up to 32GB of RAM, though, I'm buying one the second they go live on the site.
I waited until the new pages were up specifically to see that they are still using LPDDR3. They left this important info out of their keynote, only mentioning "2133MHz".
Alas, it's probably true. Although maybe with time muscle memory kicks in, if the app uses the bar to promote really important "instructions". I didn't see any mention of tactile feedback, which is sad. The idea of a more generic dynamic keyboard will have to wait for a v2, or maybe longer.
Exactly, I want to look at the screen continuously with my fingers feeling the keyboard. Too much eye movement and distraction from the main task. Personally, this will just slow me down.
Looks like they are using crappy dual-core Ultrabook processors and calling it a pro notebook. Going by the processor frequencies, it looks like the i5-6267U for the 13-inch touch bar model and the i5-6360U for the non-touch bar one. I would rather buy this laptop with an i7-6820HQ for much less.
That's absurd. There's no way they're using those processors and getting the performance improvements they described. It's not unheard of for Apple to get custom parts, or over/underclock the cpu.
Agreed, but what if something else is actually going on: Apple wants to broaden its Apple Pay integration. They decide to add Touch ID, and adding a black glossy button on the side throws off the aesthetic. So instead, they replace the entire bar.
You can pin radio stations. Unfortunately it only grabs a handful of songs at the top of the playlist. So dumb.
Edit: Radio station downloading now seems to grab at least twice the amount it used to. Which puts it firmly in the "meh" category when it comes to usefulness.
Aruba wireless (like many enterprise wireless vendors) has an mDNS proxying and filtering system, which is essential for places like college campuses. AirPlay and Miracast (infrastructure mode) both use mDNS for service discovery. In Aruba the system is called AirGroup. When the wireless controller sees a new MAC address, it queries the RADIUS server. The RADIUS server can return a configuration for that MAC address telling the controller which other devices can interact with the new device over mDNS.
I always assumed this vendor adds and removes AirGroup configuration to let the phone and the TV see each other's mDNS traffic while hiding it from other nearby devices on the same VLAN or SSID. That trick might be enough, but also pushing a DACL to the wireless controller to properly packet-filter all traffic from unapproved devices would strengthen the solution.
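To make the flow above concrete, here's a tiny Python sketch of the policy logic: a controller that, on seeing a new MAC, asks a (mocked) RADIUS server for that device's allowed peers and then uses the answer to decide whether to proxy an mDNS announcement. Every class and method name here is hypothetical illustration — none of it is Aruba's actual AirGroup API.

```python
from dataclasses import dataclass, field


@dataclass
class AirGroupPolicy:
    # MAC addresses allowed to see this device's mDNS announcements.
    allowed_peers: set = field(default_factory=set)


class MockRadiusServer:
    """Stands in for the RADIUS server that returns a per-MAC policy."""

    def __init__(self):
        self._policies = {}

    def pair(self, mac_a, mac_b):
        # E.g. called by the hotel's provisioning system when a guest's
        # phone is paired with the TV in their room.
        self._policies.setdefault(mac_a, AirGroupPolicy()).allowed_peers.add(mac_b)
        self._policies.setdefault(mac_b, AirGroupPolicy()).allowed_peers.add(mac_a)

    def authorize(self, mac):
        # Default-deny: an unknown device gets an empty peer list.
        return self._policies.get(mac, AirGroupPolicy())


class Controller:
    """Caches per-device policy and filters proxied mDNS accordingly."""

    def __init__(self, radius):
        self.radius = radius
        self.policy_cache = {}

    def on_new_mac(self, mac):
        # First time the controller sees a MAC, it queries RADIUS.
        self.policy_cache[mac] = self.radius.authorize(mac)

    def forward_mdns(self, sender_mac, receiver_mac):
        # Only proxy the announcement to peers the receiver's policy allows.
        policy = self.policy_cache.get(receiver_mac, AirGroupPolicy())
        return sender_mac in policy.allowed_peers
```

With that sketch, pairing the guest's phone with the room TV makes `forward_mdns` pass traffic between those two MACs while every other device on the same VLAN sees nothing — the DACL mentioned above would be the belt-and-suspenders layer blocking non-mDNS traffic too.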