There are plenty of posts around the internet about Git LFS and Bitbucket. However, most of them mention a setting that is not available in the Cloud version of Bitbucket.
So, to give a simple answer to the question "Can I still use Git LFS?" ... "Yes, but..."
Why the "but"? Well, it does come at a cost. There is no built-in solution for dealing with large files if your repository is hosted on Bitbucket Cloud, but Git LFS is, in essence, just a concept that can be layered on top of any Git service. The question is: where do you put the data?
The answer: any kind of web-accessible blob or file storage. Popular choices here are AWS S3 and Azure, but even a MongoDB instance or a plain file system will do (this depends a bit on how reliable it needs to be and how much you are willing to spend).
Now I guess it's time to look at what Git LFS really is. As said earlier, it is a concept, which is not entirely true: it is also a protocol. It intercepts a push (via a pre-push hook), exchanges the actual file you'd push for a "link" (basically an OID in a small pointer file) and pushes that to the repository. The real file is then sent to the "lfs.url" specified in the ".git/config" file.
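To make this concrete, here is what such a pointer file looks like (the format comes from the LFS spec; the hash shown is just a placeholder):

```
version https://git-lfs.github.com/spec/v1
oid sha256:<64-hex-character-hash-of-the-real-file>
size 12345
```

And the config entry that tells the client where to send the real file (the URL is hypothetical):

```
[lfs]
	url = https://lfs.example.com/my-org/my-repo
```

This pointer is what actually gets committed to the repository; Git itself never sees the large file.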
Unfortunately, storage services like the ones mentioned above do not accept these protocol calls directly, nor can git-lfs deal with their (complex) authentication mechanisms, so you'll need some sort of intermediary, which will most likely be a small server.
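The core of such an intermediary is the LFS "batch" endpoint: the client POSTs the OIDs it wants to transfer, and the server answers with signed-ish URLs pointing at the real storage. A minimal sketch of building that response, assuming a hypothetical storage that serves objects at `STORAGE_BASE/<oid>`:

```javascript
// Hypothetical storage location; in practice this would be an S3/Azure
// pre-signed URL generated per object.
const STORAGE_BASE = "https://storage.example.com/lfs";

// Build the JSON body the LFS client expects back from POST /objects/batch.
// `request` is the parsed client payload: { operation, objects: [{oid, size}] }.
function batchResponse(request) {
  return {
    transfer: "basic",
    objects: request.objects.map(({ oid, size }) => ({
      oid,
      size,
      actions: {
        // For "upload" the client PUTs the file to href; for "download" it GETs.
        [request.operation]: {
          href: `${STORAGE_BASE}/${oid}`,
          // Placeholder: your auth layer would issue a real token here.
          header: { Authorization: "Bearer <token-from-your-auth-layer>" },
          expires_in: 3600,
        },
      },
    })),
  };
}

// Example: what the client receives when pushing one object.
const res = batchResponse({
  operation: "upload",
  objects: [{ oid: "abc123", size: 42 }],
});
console.log(JSON.stringify(res, null, 2));
```

The real work of such a server is everything around this function: authenticating the client, authorizing it against the repository, and generating storage credentials on the fly.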
Lately I've been working on an extension to an existing solution, written in Node.js, that tries to solve this problem. In its current version it can talk to the storage services mentioned above and implements most of the git-lfs protocol. All you need to do is put it on a server, configure it to accept your clients and access your storage, and it will work.
As it is an open-source project, you are welcome to contribute in any form you like.